[Binary archive content removed: POSIX ustar tar archive containing gzip-compressed data; the compressed payload is not recoverable as text. Recoverable member listing from the tar headers:]

var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
@gWT5;Je[taK5Osmt8[ GK?.W b&N '^codV)Hd'd ׀EQp& @ S49- ,1OF\M`-baNPLAhRYNZY58jf?PTt'^UTQ]E8!<&)Fs!2/2HLF[u.̓F)a g> BEKYmqvEN?dnԜ5< P^:@dscty/QI0Ze0HIkB;/Kp+MK{vU aqaRQ.e4R ;e [ڄo {}ݮhv{NY >1ߺy`SiJE4D^pbL[g5Y }]3Ў]( lq(n"'M8bfm(J4? pAqLgw\(iFH) %!95ƀJD'V[Kg;V4@b >[ʝOP`lOߴپU~SÏjG|WaC !̌,H+&H38W#B5K|{"pQ|~eK_#?ts?j rO98'.UGlߟ~~?iGBZZ@{u^*?/ܽbɿ\rpWxz ! ycXQŴvi{zS٭Qn~c-[S4(lc0ub;xKƅבd +Kb-˨\,cJDc9!DQrƙa'%cDø_L7Gm%CkVfδ|js!"E9>tl=M|"may&C\r{, ,':N\J (I%2Js' FJ,S$8Q6I[.)hE$0a0UI;QP璎)ÂEi5$iqRt kzhjIyZZr-XQ"'>²X"f*hBXqo-`\HGIaByE8\|[iIv|hKDc%tkhR,5N0&Q+ 4&PX ؑ^nͥ?l^:u׻i!"&\D7P Kgdy-H@xK T"nEVI!Ĝ W]KXN{iG˞;>v:D5&* LȘfXxILʼ:qTh 2(/cjqfAuY05I=$̪4+,&N{X[9g+2(t`:<3WlesqϪF3-ڊ̛:"r\z8GA9Ch D@ÿC3P8()|C@uIKR2ji"xFji wsܣ}P]{V.hf,a+QÙQC΁ `h w%H.e///obu + .V3IrzIb|YUwg{߫M߼n-Y?->@*qk k:j YZFWƥG<d35&ԘF9Gߍz e3j X NDFQ:c=xt9`/3\vxEpB5bBO0–rs̤0独V1I sCU˞U#KC..q17ID;;R ̝jVPLFN4.W7J F|m]Xd4o1C3J$RRJ ޾'aHjzerXeҺ:+]vFd;yNA,zYZcHz,tq.%˚O/wѵUymQ&.gdb+0gcbgq٘YZs&6HIHMlmaT/ ٨,碮La4QWc׋Kx WUW=zPWWW;w4jɶzm}k~ժ2;D0Q'McR fbD&9zBOpM;Gx>bα @ \LfYPsOh^c̙fښ6_Ae7giW!:Uz+Jv9f`3HxSO7xD@GK%E`Mw7݃˯h||n-S{xryUO~U:>(7s"maKZTւ}{?en:yM~3ʃ)ĤB:Z $3P 򻜏[.g{+ L"#eJdZg66cNDQDx2OLe: `j%Ʌ4x&`{򤐴hYk'BJM"C kߑcVc` F"/2G8c2XM--E_@&3B0aGXYmVsFǘλBH0- m)B^`` kjrq"w,x5F)GBJbs3`)I!`DR=rN0 e7 zIrEi?{8'3Znn†zuFU+KnpM.xIA`Z¶u88LvKω,g6u6 LjՒ3ucY iSA~V[F0)l0Vi3X`N+1%mj3$SƘ66M|k۬[#\Mc[#J֧[3Ɖaⶬ, M7 ԙII3b$xB?4 .y9 cJpXBDm< I.em' 凴Țt{\]&7[ j25/xu@t_((cg%MZf!c-Pk]L霋o-i<+qC1?"RmWx# 1?.R\oY %vs#UOBf6ܠd / ^L*7F~5Ǧed7(q|`` m[JZ]b8!R438i)M*ɼ=֔G,peYr~RnY)CKE++DkԱT˵>Es4& x \Dk޻BeNUx/&*st(uJcR0exWG "Z!] 
]Fɻڰh Z ]ZC>D]]Yƙ2vNWҪ^ ]Wg +>g^joVyjs'wRrgN\ȵ4w"mi4޳62eEAELK]a'tT΃> i;Yr{;$lK JLV  )^(Iԉ,W*(l!B dh'$9J}T~k75Ri^]b6pcR[zv bR&44=fͽqkpŵEk(5\tms4l .%e Qr )7쏮8h r ]!ZyTS+m9a&Dj e~{ȯg#?FkJi_z+gܸ<}^矇o`O}C~ 9u^`ߔ{ bLlpKr~u`&ǔ8 9ycOZSe}A>mm̞ X/Sޜܸ* D}adN-t36gtEcu#[>T\լ 8UW{h^xK' ͽqajtikOH"4:=ǜ$6XS9.XyYY,\[f 7D/ _akYxGo[K!]_LN{2aK70C/Fȅ);\ׅ]]m/96]y/țY[4\"Jij2MrG˄(N09M*︨bq>vOi)) 0lqMiC*ҩSܧJg&2WTޅջdY.`U~+O-15sȩu9m* Iu" k '-^.|6yեE]=fw7 4-a.e{tIýein*^d[ER6v+GEgNGkCi.Ź#` h~:sƴn5WC&-ˀVT(iw)j*!o5L:IJ'S7UVT`&0K7 OB}3'SfC{~-vC-ȇrBW!oo>pp],]?:[.|zZm4+^Eكhv+쥒knCK0tz`uG@̴VY+k;ul:;7+-Jۙv4T0,xI,0fGBWWX Zqt(2N0BTNbxNCRBWV}9DiS^]險gD%߻jgFpVF@Wg GAW]鎮vV`cr\gU[*[VmxK I?{vHڈoݒ7-/Hz`M}TGԇp t3Bԇ(+GwSL}LbnDtUnEc k@r\e EaA~.*͉R:VBW3&,{ʙl#BMyRp{@U{J;rQ~Z ]M8zLvtut5O몃ݛ7d~Eύo:W;X ZSлԏ$/>>^@]٪`-&&v];xm Q!ƶ"m 77~tcIQd4[ͰrP@t:W X<ϩO A$5$iy`>RteESUqC==-#OtPW-mTt'(O;/z쁘`!FᦇJO<˙Rs QO+ uy!c0^맡DA@,oRPEGI!Z7IO驛1Ȍ]_mzJ%DXRT^i_ϢY[7% g-uC-¬(Q)un4%BjkݒW^^J\; ;q"㏷["Ii]k|~6׮>z /R4h >A|Ģ|^ ƈYXp8wX5d3NEJY[I jH,^J,L]MhWe9" 2mOS2;q "(Tl߶f%?Q;GwwY]v/ad)<&_-~L)zyHTz ( *rG.HMEfK$' H)W%!(;ˤsA"3O$6Leʇu!VZI2_޴%[m C^$Mx㵔-ͳ@$ V"TbUm" B7i/vWcɶ޲k>{ '/BYp=x<*P# h*X큇A`F)0WL{F (ڌO#{ 4gTh0l_?&;w vp/ḻ)K_]}GXikrg, _TvbN$!/=^'b4z0({B.ܿ.޺Bz\Ƕ'82 ; ,`jBq go ]6o%yMʆd; ΕM *æ5ŘOk#&3x@93;BL+ݴɧmMFuVx-X&g-Pa.Ėڞ-ExŀRھ5]{SyVy-ϝeƯ.˴o{g~Vc _io-Kα+͘].,Ʀh nv꾏SWer p2e{1p#qj;͚ͽ3 L4[/8GnӍ}*!mM9G(R*8DH\Zr{AA:#`vRn_'9&6 [@I6)Ga d(|=H$ks:`T19$!ҾٲJV-˩#Rf@Kɦ(R ]8y5<#6yhdž?LC~ Fl!Stł.PB^NM2T}H6sЌ"I>JN{GD4}N7Böñʓ-~ SC:*vl/ܟ38(ya5)@!*hT4bxc6%Xڂ7K1kYygmCCh5P"J0 xij|9Œw⪟qLI3ّqIs7 >($\O;0WJ E{hq8SX|@x ʉuP؆ =N)Ƣ5`LN Q%al;A '@xwdSmВt*ObAͮ{6 `8H.%yy^7[6Wgr j;5801iH\]9* [4DTCTnqP;>Vs" NIס2=ltiғ(M38Nqn!`8 9?S) S5LxXk&ټB넹Rõ6@ݳٽ:%i#N/UGi4"mR5QE'z-S)8m0 @I 91 K`p!JLƯX*`<H 4Z2&xXsD,/[u#7;_l̛M);Md`J!#&B 8#E#-yʤLh4%1VAhl-I6IB$j)XI M޺{Mכ܁Y&_Qdmd{z?OC?Gߌd2 (Y=l!˗wO_{F_&og?$2LqENP(JRnm?nuNQ`mzJ 5s&YoO"ö߸:H۽rƇQ׿/gYPHjkXm{y:u,%Z(o@)jhΙ7O~L(iaAӗLbqhT\?onI_)k9SJ\eg-2!$NC($rgpyY3bEV!>$C<5c촊uY]) l 6nZ(̱P*4wg1BCB\!zF`|UE:/~KtDFIq*܍R%X% 
D"ĚhjpJwkG=rym=ꬫLحYg-2Ð!1\J(E"q-:7/EM~ƽ6}6ٮl)Iy h)h,@tQcC*V- }&&Z$S˸oq쎛JJUvD1TTc,A<ƹq-N(:d#l2цj%YfwAD8Nh xǾ}+EkcȜe|M[jNS:6 PQb"@qhD =pdW+J蹌W[ zU.~y{QPmVJΉ54 ((M#hwX>sO:}|[p["ʊwG7Soc5jwH`՝6 !)C"RX7]T6Y6v0Ƿn&po%o?3[T%c=SGRqɞ#>kp$CE!CNyjDIÜTjץO&?z0u8?Cx?j; 7>9w\Bu"Щ<%U-V-<Julx4'ƽo 殺tT^xUKyal2=*IYeN9]"She *7 \^/J y?*8{qNJ!Y磜0lE ̝SJTM<<ǭͥ<Cgv նW)EXP_}msc7!. T5wk^/,"D[NϼxZ[}jK\0!&/2)h>TNf!>{E] b| t<;%Q# fZ] caLf|}&DU0ŠqMS+'IP8(4MwujXP+qG"5Mg^ :aFw?*- |4%uv{ !sh /4=b6*oQ OeˠM7lva2^ 6VD=j+3\лl.GXPAߤU}%OY4TQǚs0a!%IbF<%f~,[+] Dzg5*NlOO3/޴@ =T n)ɖlכ3fؿ_P|Gg%r>x>lD ZAQ6=N->Q;1*њK#1^3% c\:oB$ÐI20cT-IVpC ~>ˇP*z'fd}|Y|T G +{+ O&IiBd泖y$QA5yK*BryO?8J){Sk $G}1ЀGIU6Asf<t;цRn'gwqIcR녦&Zr+mIE0M6e} 6=E,l>d'O[mZEheJ&Ht,S} #a9%"u6<fYRJ/i(MC #I0GsY)Dx$u)T@ʞ&25;{xߺEW4%{ݕ7FKAy[Mj ޣuÁ.СL+K+SFL(h[P 󤚢M6%4Qv"0Lۯ¥0M&KfiJ\FSi9ɑ;OdKl2 ja96LցepR%Q[ȩͅ2{OU0D˷b\DOaa<f0faૣ!#A}!!!Fhk}'7-Ɣ\1P,u|Ufax(i/a'~&ۋ5?"EfY0I6?ɓb,9<e#yClף2$K=#TX9hK0ҷ1%.z~C:h lVj[-2u@EU6m+B-0`N.wA) (AIMCLAGBJx,c{\¥~CX((:.}uլq\Ņj'Q:Qh hL|Һ'WK$-h4(-d ~ۚ;rzG풝M0hV{{يjָ8~j[=`Uk\P:efge;wES$^^))iaw !2'4c*djX10.]0TjOl=5+Q*W~Y-]uavn9^Y嫉UGv;lCTGU胵kZ8iSU/w~ ,%W%}*MU6클jYtVu*/OŪX:VO O;\z8+_Uґ/E1k^a5kDCl^h䫯KıUG:E=Q&`j )ǀ!#'YcP?%l]1 c'6.m3R+^V,ist4[?M3۝:=fh+f}Yq"t&7fgF5! 1jk8 =Z%W\w+RyV UGBul<ޮc7el Q_T[}Y*ݣOx$>;'p%;HnC L@ޥ`!Zj袍^8ޭ!!osz(xGC܊i6c!!3~IT\n[,Ʃ  xBqPKs˞-.Bow=6#r<0i#'n؅ER߳{SZa7١E mDVC_t 0,VO}r2͞HQ}|gtu̮SSpI)~ Pm; aGdL]R7'I=ʒ  7`%=Mk{[G51y<4?~cT|X"%= |륰S'X󂃏P8ޑ,[ {ZLxw~()tĨXc =1Ņb)ڐuRrڤnܹzC`Z.`D|^2R1/i:˜oz>7ba|aCӖЧQEvG -{|v{wI< ഭ̴`ٽ۫m4M5=JH fc-l&)nhQ&{V$MN!BIDFj?Oqc-Ӈ DoNwc@@L9a`Lf1X`=ݎ>xf Rl1r LAҽ[g.W`L+&vo11u2;4qLud_l,(mQIGܑcc"Tq '1M]_^)&cϣ"AM_2 0w{W1I\ Qy2ʘ`qN[s9{,e*A~rWGVq*}dƓF$ } hVƁ]1B^u!Tʝ&Cv2!c)!Ŏo.}G_**j>gio$s$Ev+)(QӜ mHĒ PvHFw9 vÃlz}4*̕]j wX4,H߶.{̖7uR#1J%Ҧ=W љS0<%-mb҆Y!"Iy|jzϻ.BSѾCm6 ad,-H(PB% 1J HZ]@†O:2a4HK5D mMV05xwDsƁ)LQܽTK exp"W' $tN)NT"Hh !à?0 b;66W24Hf5;@'h×? 
̮\H%h“!z|>08b>e.8;Y1k^]ǟ社Z]jYcz-x7Z򧖒}!>P1h!]<'٬T0M0ӌ}H*Ɖٕ*I<{TP'mɋ3gdZ]^='ἔa")mA/<s(`YSVzcFK%ƛoī#Uh}jZԅe׊f3RD1f cdS0zȢv+> isWsjly*]d+ƈRƛ>91Pz6Z!+VNf1J) dL!A!8sxa/?uU/NbD 2PH@! $6m8 k\(} c&*Eߒ8urk}pmm#5*P,BBa!dceq:Dͪ&]vm0;k0+x<-}qz?=l&L3>xYh1FԬQE*cBgT |.ˢb:QUjKW{wËgBkB-)`68=v~C;6A7V39?ɪ؜b{j]g7P2Rg*C?4e@'}v*cm'^Vn:SH%AⴳV9e%Q?{Q0eAvۡpDBE\Ԭo<̲t5LP&N_yQY[m,vcN.v1ZD 0lʻ k1 frVڪ2d+֫.\u6݇䒝t4*Zb+P-8 B}N$(@ ñRMʤnKʙhsz 7NJE{j^q|5Ǫs.g=(O;c5Tp[PPu',[<̞l56aٞEB;CLɾ*V )z,z_[*fnv*P20K<[.u\1%)!4~ jU |~& Ҙ@ޫ7?f ( rjbr9t7q3J2VU PC@AJ`C"Ի$d|IQOUJ́ɓ])+uiZLrR#0XL@=u?6xULبٟrLZc$xS6s`{Vx9 G ~r5zIÿ`ȖfzS~ƹ(mC@'1 Q(02X䀦QasN rrP4.yLh(2F{wJl ђA+%/ Wлf':OS!s:erPM]Gm*ok`(\{ҋb[s\|L |O25jARb]p ᩔgKwp&n4~*W=֔? %m$.Qy;AW8=uw20]Ďl_Fr8Rn6De0hC%.(nzy}LYpa 011+0Hc "~4XHr+WNثN;i] Y2Zna ȑb eŜe; -UK>.MVaaǎHLR>̰x 6[8en0sc2s{>x㧤^ǪvHsZ;?|IZE2yp$ЬkCB stc~˻+OlaN~ΑNX="C+ʷl-՚n WőԠZ16ULT Z&!c@{*l}vo4hct2U6z']L#́g(¡H2_LKAb}hH56C 7 crDMaHs8b<}^BpUΣ0?fFFD  Q3rLGz+_ 6ďo#9;>Gx.jY݈Ґj.Sn,L sNFNsC f۴HqqI @i 1 :=hpf 6ZS6qͺkpfenjFC!P}zm}K5UȓUA /MƑ@=-<'^^54&r"+ ؿ<^fVC!uΰzSsmޅppQcTMp焑[={kn]ڶ2-AB'V:R4nP:nr,'15)uUZ jjI5&c q)şft,kT>5HLߐtUU1tE1N*xvZ7PNƐc8&fW\N⮩c7咁t񳉾|RRƘ?)(ͻPm? udžAjZC p\_g{Foּ B*y9Y m8zw; % Q.p;rh>E糇i͸SB.2ssMKdip;_Yۿ @"`aa Þ܍}\np01&;CN8]v8 N_o"$װH+&qC][`G*xzC SKUiSV+,,؛7_YC:b@SOq;_?W ĉS\jl\g"cg@)J]T.?0\C_WA.x $/Q@B(Rzi> DC br؜`:Qc.)ZI05=56ntԣ$9NoRMjI;9@a)qI7'4m4k4`( d98Kd9-<)2w:qpL.&S]FZS~ +_DpJBcR3Rjs* 'ά7 -,3RYޙ D'Vx5|!Wkt0dYZh 7 … T$1+0pu$+EʚSփ_ne#GONZmޯ})~.q j*_ik_#%YQcc`I #Pn9 {@tN.eGPLa HUϯJ}d~ "™7Śћ_c9O'k׌ͬݤGL2Sm)ZB Ȕnkd*QL&GX^S-R j=4WI➖(TA͓8cSҸ:gCь kPa=LaؗpL$F׿A֨¤BPO~xJ͓huFn7`r`_NY k f" Y)u*`ftg_QۭP$8#j]Ʀ.I$בr^\ON &"%qMkl s@C5U&=ojWTܯJ3DtgvBO]Lvl7EϹ܂FBRi xPG {Vo?4cw0Rw;C^> { 1[_D*ٽ*(CDaAvF* yl!UuƂй`"$98R J)6J7;a3JnU:S!zL+lo)#NӀ"py7iT,E@ΆںXbtOV{v2g'{f'pz*.=HWcܻ+1N<H# SԨ"\j+/j;`ub$ g! BW[h _UJKRf' ?(ź{@ a`5('nl o;/پ ;A5z",gy03}[35BX=f˷fo۴x6W)3:E~V Zv4︁eMvDT~ttR~G(v^%9giY1b;:8~2Ķu! 
n75XC`i ӧQy""|j>⸲ӧSB F"S@XB- Ç <֍*/Umnl*aH+\͉MXؚBrĴo3S֨cF>B#aOЌlK~n $}U[`#̤7J^S=@fA??mxϔRR>0r(Q渭Wk"riTjR|QH#Ff-9*mf8v1m$#4Jč/[!g)`[lQYlDo(س"Bi4#VUBGDžF+޹YţAbu[D蕰FQOPT_Nu*Z4]eDX_(<9gTBBmLweS^CQ=8nw!xtD]F~AJ/7C(-x^iO}ZQ kY`$@p Yt8XCcW'$I-25+ǘTGsٿ_/$3W3b?c1M&L%T{hv4HDLDey+vj:[Qy(QO`90y;ڡOMhPqI02Q4 v?6Qʃc]{oF*\~? "@nw\|(%g&wjJ)"))-쪮bu:p3bkT4CVd) NJf\i)g/ވ)/5~RҐBGၣ uzT}M>i HLónʟi o-3Wɗ i(b&{(aݥ rjYKDG+WjK-+hoOWl(t\'m6 7ނpd_u+ٍfcgy$`s$>ƒaH ŅGC_5joJY3걇+>oDB42TZc‰#`c5UHGJ-Tf۬cikqZeY[2[BV>gciˊQ* /n@킺$\*f#5 4fȊT#iS*&Ƃ#2]Ab˕a/y&{xtf,)^l%][O<( \>P2ش-}B^*DJS;7DR+FЀ}l`B_&1,!$_0xB9P ^lh/>cnhG;Mg`-AR"RõJ0{;:qқoR ]1H! .vxzn-$4a4wDj}7ۭ7)dE~A:]Fd!FD/4Hl>3ه4N4΢y7w:~: 8tGRij_ LrdHp/'1ԶE vN%xW}qRQ"OXӤp>np}E۹imElKN7rSoH~X=ilj ,E{dt*s*SymVs,ʲK v tOCrsЌi ++wD֚^!hEOaT J0A|S9ΤmmpYeAkʺC!Gq K#'npSl8Iۘ]\ #{Cه*b7c\΄3&s nw~ &ٱA#B#LV2}/GM N5{k|ԯFǺXXBZ׍.׳8'{_aZn^Wtŗ>1,SPjCZX"42ǍWQA .׿Tft>s~;y+/_zZK0(ٴ2PIv]p OֹwG鶅6 1 dBV2;r ΙKYQqHFMthmG#d4LP43;t%EBlU8Ϣh[nIƷ| CŢJ2jOɯU -4z;Ky=bz;u[?9{RSPigX 8pU?.I臋R}T:} nʏGݰX{BL!;n<Qx41iiU7֤B{LfR#JmC,0 ;_꽷Xj,O&!y\!^oN:` ЀjZG?G ۸g#)lƧQ S3ME&Aut("v`zا* ѸiGܙF4t甴~(&@;Neey;W]kgQR{4<@k,5!sFx@G/+CO`?vfQL7r[.mpס-)VVQ~:cePL'I}ZfpWj>uڏijx"%Ǐ+ @h}mV0(p;l "@`"lUSmm]T4~`z&o)519AizN4W$kƽ1r#T@O3aԎ7KlGKO<;4FC4/XwTg$st+NcZcZZ›Zdp6In~R0[+q6&ŚXkNKYE!hs!tg".q[E3gomߍO*1pg.vihZ=ȽG镶K4 Xf%mBʡt:Z't ~?wO|ݻ7gtiW'^oRA,\.>GgxiYȜN*Cδc6I)@Vd11c 1bs#2AwOOٯ7 TL̗_`~xZ)-Eھj{nŵ_ ~4DeeQz6ϻ/w%rIk[:.xOx}7L~G9)% kn 6FgT,v/'11ƖP3)IX2 3q kXby6 m ~L`QyO.3;48w%QSaYN3k@a0 wzlxWH"$c#8 lU'MfK$h GYR2Nb/J_RESP[pRZ mNn3/N `aD13cxc)ѷCbUt{-!3R\EN>)h#%~;|~?]FEeԻ* tBe ; dh(S6*DݽxIGGuX˶3Fli79n+BۏN"?c  \zsAI Iޙ ‰wV:/áG$)MsxքTIv 1$\xDSd82iiKsUFYEÊPL5ǩǹl`PmD;fVLCE|smJÆAGVM:Acd{m8t$QiKjwWފB?x!@N7's%R)ck?/ >kϾ&TU VnWָy8_k^+-ܢ'w4 ]%+!. 
^"]{kvۮm`PSݱ k4!FV4ԺX8Kb㠌DhN8V%T0._h^vKQw-SDM:VժF\>d?Ć2[@O}1b:Jb8P[5R0QbX !TQH{45JQ"[ie p;Mx/ ոTU:@:3DG`0fN䒿Dz#eOIJ\d_ 60 60栟ԧx~k5Ӹ0ipMsD \.t/_;th\sCWl=4$)a2͆td071?ywWgfwo`/fV G3XRrK΂}mR0y.66Y?ܯozO+8rWxrByl'~JFD90`YBQ椊q8keHn++>xdT̉``Im2-HGąjWunyZtA?YǻZ;ڷ< o]2~LvN#x*i Hv {M}tQ=*ӵGe.2C O33VC0v`I_q:Xq;cع^T; ƘqU >|rqJϨAWW,XZ1#Zo~ qʨ{uPݒtj!M}OnP2Gr**@ҋ *b@(^e4o./ ? \޹?ѷslx]bJTs_(Ln%e]Qg ?7r\tymg>Q>Kis{\Lbu/C&eݬeڜ^bDkCT#uMhI,ڊ.llkeXg ab hXsQQ`D e7?q6dKu^fki SDOJt[kIVY"uQ_F]Ue zԂH!V e -h eѼ2蟔|blD |Ѹ mtS:!V4yBWçA-+hYFˊ6ZVtG0P2i:dtsʐ\45Zd}˪- BҺ 0>J).aٍ.e|X-[\\LEC>l{02q/ZOa,]~Ep큻]]^hZF[I>=ہKBf")G 4n^W*[c.=)eb9 kYxhq/Ej U\X1g+.kmX.kkq] ' fGwΡbIT^JaBJ2*,[$!wN9!Tȑ w3 J첫J1 (N / 6ɒN(v` Zc! ]}Nh!68ʦWJU|+Ueȧl{ęj( ;o8E T"r.Swz \2!t־;*radVg.vjUu$sRUf9/׫_~DE( QYV&7m {@:W-OPxKVk;nJzd߿^>ut[ȭm Τ[%n8R$Urdm <1ֹ-́Ue2YbƵEʝ){FSVQ[C܏7?} ;x= Zбb~`=' F]}GSg/M0n0A֧Ŕ*v>c ۱&}x;9#{MȦMh #[!|hUFAAK֨|̲![Di7BeZi]}Ovkjb[ e蒋JVI deE9Z *۬w?}:5EI̦7S_"w]>@]; QƻNmH [MD kV2Uz#ԢJjՔ!Mt?K65623.to3i"kB Jy2ɜf ?[]ϽĽN `nF(wZc#/\ qEdvC(NxJF UD;0"3&ު㣗\ٞ:l߄}>V_fEVE4i++)IZUHWic +"n~[2wWr=C޽mXc ͻO\~Shi\ںoN`rY猬̞q+m^x( a6Km2Hes5B`|bM$ɪb-F>CLgBKǪqk@z]鰠bR(N>k02a7HR&&d <2G&^u?ht gE}gO}x@# !-O6nE:iVm5h,j8E"S}.I= I(VL,6>Mr3Km%-ҁm[&"=@6!7u2Rb-]=/rj +6Zbmkmg!<}:}v|G.lRYN#?\x/˥%EI-dT|E0$L՘i Z.#ug".[ |F/ˎ\AB<57}8)_M&V{ܭBqIT,Ect>)&V\/Iw`(РPG?4=gi,gEb$h Qm-_adr_ճTvpm87;~rZ+at<2 -Z65B2KFk}+wK'_!r~6_Ɂ3oy y+0SQ]6Ғt5T0j7Ͷ mV]+U^e"bX]-wEk5/hj ur;~ʤ[ks=ʤ?MC5(%< K8a4ҍe-@ӪwNU7@^x啅ܒ[A6(G"Dx.HWaLUS$e.9bPh!Xmj#03⥂mOH)RV:s{&CE!ty)-:ߧʆDP0 v(~st cG˂hϜ<4KAvTyu/vlמmgz2NҰ~4,f#JiXwcs#g, iX4TOݣ%ƌwP-"Ak^2b[ jYt+z id-~꾔5黆M2YaΗwζiӆq̯;z4H1? 
ϟ]E4>MϋŔ_bvyv z;̿9*2??Gobq174~Z}eF/!iLI;s*|QgrҘM擫3';Cgǎa*SE eqUB.Jқo+hT\+ȋ'tfm=`:MNv `aSOXm)Z} <Xѹ84ͽEh)+G~|T#jSl}3 _N[5ĉy&ARH"TܖmP*jE+W'.k]wHh"[bjKSͭT{ZP λ}wu{Ug&i+3(߉(y"heMH{٤8fq+)*1iBCR6.6=K6ϋ_ZslﵘJJQj2㭴 -uWڭ?ʟY6ik]:pSNEg ?ߎjU5؞+9a޵q$"mmU<$X d9W[%+o5E#^RdkpKpÆo7?)BJF @B[XYVx\*%Ule%1Z{QK̡h,_?Xxs1F{S~wv7}JkMa&2dbMI L*Cʱ(mR)Ԗjhɇ cU#OԖ$[]f]WHJ:2 j,%FJ:izb1K*$gBbmE4S>EgcGH;*"Hқ?^ӕȡ(~i2H<3 [`I:9F6 w]"sVhkj_sc"JS0JDDާȋfDy! yчCD -[s{dO4Uu, ǐ4% 3#ˊQCaSMdԵg QLTِ֨e˾!ՆPW_<<>olOf{5f{}8=*idOQmThrWtGebӁ%ؘ}sV.!t6ڊ6]BtK!y {؛׼e)jф\Ю%$Ί*؇*ۖ*S)TXj2<&Fe/ u1DKaiӽS  "$I=N~)ގn+127d@i y1Y$߬$;u"+q:'i.yN/3rCBlI @>h^l5?1N!jiaɓ'$6#l&iP4W`Ꮣُ, x؟scߥY&ϘѺ,5/\/[=?tM$:֪nWLJhM0({HXzׇW?@5oHx2^>Kj~[)fK<5FR!]UB6z_yIkr;zuP-(/7ų'\Uj@! r0{A$4Jkke7RR!0|NvX*&:Q OEP-𕶰X!e)MC,S^"SeI`6I+r8l+ZUԚ%JIltV͌MufMlc8dH9qr^u)׬DO#!T CEϳ< C\~".k/Zdwͫ8i( O{ ˣozzۃ_nvC[5}Y.g,fn )y?G[%Lg!'az0K+yI_zYUgs*>V?ChOΛK+ 9_Y`vY2-^9-nYÎFXYղJ²)+(2V)Yj28eX_AwCzvۋ]ؽixKB^5f->}&&홻fqu5-Jh|SA [rkOx[3 9Ky)ѷb+CFƀ 3[<8ClmoVI樥N'sAu}1Y9O'ːNF8q.܋n sg֯[iJ֜t{:3kntߏͩ˟ͅY88>$OlJ=>OMi}74ol l!Նf"dsIvKF[*ԲtBxo'r4oߒ֎IrcPn胖gumꔟհcR|72,lP}:u9CVw\')cڬթct܈H_"eԘz#iQҸ:͑@tD8cܘh H񊁡^y;iF]y!>)\S SHw|7+An0Dx4@XRrVY\vB'\AFp_Yv}&9+4#@,ߜF?o #+W2Xm10D:qRL;n49TiR9R9MѴ Moܴ#PRk2,H"DZ~WJ-a4^i{>}$Amִ=EXh"SC+W]siLMI~4Ayүid-MI_L8pSD1듐P|D8iF*]LKgD:5u0Jq1w8=%}NrWVqzُD1Kߕ%XoHlXL=.N˓D)u;2 R TS1buE;8( -BlChP_;!^&o);_l=L"q>$4nqS3Fy_y Eo&x4FĹQy{8 d*6˭QV4FPN2d &~͊OPכ$>='kga-&P|U3t/^ֹNJN,%#I,f"o9hbޛTKґ/`-Te[HtBY1 8|Fš]638tSvzoje#G47hnua!lbnlx`w'7'\j|s̳!Xl昪-ŐcIW8|=ݬ]\!,giRxɾ>i7%")mɻAI}H7Gv6'cN\՞Č9 oR$'^. 
dC5VɲDeR:k˸`$r F BClOWosWOCl}/Sx%gT؈^XۨEI Mdkqm`+='N/Xl 5cJlj+Ƙb8crVYf777$R~ѽ26OZn>1󽳘VllҁU{ 9-H ykZ2l}!'@>HC_m *Swem IgoIvDIkf؟ g!^v&{[,,\AK 2cTz{":ALA5.FOeKH09>\,}o}Q f!T } fڣ9m<> 6m6ు%%7Kgr([KhtuN(5`$ZU'M̥Zbכr%:Z]J IT(m $0(ۛ3ys~b y!/ckY`*ơ+"7ƐfF7 EV^԰UH&GlD!c35:5/+M q"5kP6V"W l)]Qf [\mvd-E _b$%3ݠbdYGU}CILʍa2V)kæS6AV7Ky3/ݫQ󋞧@iiv} $ؽ%mXn05b7wTiw`iKx(n~럼>YaU }/g@HN`n?[ {fGg0 67rs]RI)y-ak fl mt˒ajۺ_ h4SLlR5)|E˯ŒhR(]±KރF9 =sMf^9iD[v6I<_#~|7M#)w|鲄'NNfS#C^ߊ,)?[y9YIA^u2G  u7=a} ׷%mK_p&I.wvm-\oH^$ݣmTs+>A?s:9k`)s:o PZCz5Vg0X>H2 AU"׼iY$ 4g ? yoQxz$"zs x29/</)b1qw&9GJix>ݴ~h!c%w-6gisؿgjS;WUQ|;$%gfwfv{>1&9]WͷWJԘ1IB;yfȱ~f#IZU|jvN]Sv|1}y}*ZpR@QwGci%r2@ڪu @j!]] 7^|->&6gx%<Ih I1ՇD@o䄙lν︴za8ܿ=: л_vmk3[lK0sղ -^sd.߽ODX$86ji| R?fEHz(1$Jf n3a$^U COчmȊbcF)@2Sio+WqRZJf,s|[E ,xl듹\A90ӬqN=MGVAF`>4 *sis  Oh8)3G-ge#:n&P;*8zbc-9 B ;P <2 7(tϩ [n/p'$@uuTPBqa6-ln#31R16! &ZDV#mzgW]y갎U$L{LW. ^&_~r$(o;ǎP:غ{--y"3<(5tWSG9*,FLdmDܱcp! ({DZJ

X TEH3x.X?*7b+8t>5C{mVs0pdc -{>eaD8|M<R$%ǓnJٙTry@ɗN֬~Rk;rS:RGAd t5Q@OHWݔC<wuw?#| >$nEeH$hMntk;;#VC`Jϥx&ǧ a133xn&=t#VJW=<"xxB4Vnո DB[Vϣotq쫿VUq SQ̞T„2;&QӬ#-L(H98tOzEw*0~)W~Z׵y6sƥjY9{rت^T[}$geoCwm;ihu 9xªΧM*CyW ?}R jrT(S ,FV40\{=-% Եp # Sb:Yc+=gRfXIZ-O3N ŏ^lmˊp{V(dEM;~U j*Uz?/KH˵}ߣ~?^'p`}lYc.o)og^uO_p?e{;]}&bn a/s2o~ 3G)bOo78za8,|qy@Hhb[H~v2Pʛk^L <G@-{YV6E=$ DsL9)EFJY5/Ǣ4@e>G-ȫ!"y&kFc tγ:GTqǘ֏1\bi5utЕRȏn{{.q`Lϋg3 G:vO$SX5}w-΋g[d.-ƝG#*ع`&~5+xJ,w\NxzQ7'vFgHag+X3&;ӿ.=́f}L- ~B,tW.;Q`uYV̌+n>\1F`?<(mxOXZ 5x1{l_Bih4vWwog6A[‡LEhNwm^`ةm农mwr jk5>sۛ7o=xՓ_~]ޅrR/;/N4St j}&pЦǓYS~_۝r,9[#8[ȷS9~WD m?XkrOQ4p^VxM̷|% ]$OZwwp[1I+Gbj`HFe`sc֒uVOa?ԎP[02VY)qEoN%Z+@s5sTUҷRȎ^oxhж6W3r+)hZԌSspY7Ư>c L5\eőivƋ nw}Bs2*^b` ou3Y 7Luj'zM[J k3v17SnlJj B-[7S 7dv2kvlDځc Zx6L>ҠL 3vI㍘*4ƙnTDO*S9]DU j\5|:GTbc7jb[$ҹcgg0_gz&h} 'tU2٘1V(&obdmAkC&S6>W- 9J<ɴJIT"W zz*R1cyER軤sTCQ[ , CP*U1R{jӛXV^Z!biF$<` W9;J1v ^)Ɩt+R;JY(㪣 :jl~ATA_ 1jt ZU01IFjJda^p/Cd3RݗZk&܅j8V*a)@n@?ĘV2"/>}6qBXnU<9z' OT$HXS[%Xe5z  Rb u"UTcsâDd -X < -)gb(0b?,6-"b6B":Y<*S8X1֔U)%G+ `,e}),]:Ժt5 Nf;{8k3ĥrdTӚME[iSB̥#NVoW 3VVױ C֞c9WceO9I6)EmFdBi0Wl\eQB:;`0tfl0EpvĖlhY$!& | O)jIR7 ?=?d,#rR-[JrE)I);h:_ QhRT*:\!A\gmJxjwL%(}H%+[6AՆTXѮi,3XD #(EWn"|L]*4!=VF8=lsH[I0|=!Iw[xm7+n⍷fI!σ&PsH$e8iI>tsfys6ݜ#F!9$9ߌ}NMH!'FDs̙=ǯ4 ^]@yۑn:MAf__W}fʎMх@tKQ\v'^Iw]R+6w#QSj;Y/ڳacotϗџ`;k5>e*;6|"% |ci֐ 5䷮!e]D iN+>L!s}l=g_D_+Syn:<:k^W~-5Fѳ&sdַG+W N|6p@_L%mN9sG2fNN)f) <-/͉|gyſY}H-&y$>uf Ѭ|+6PהNqqѻpvܥOATn8$&&RAQUQݣOPWU%N#ObZnƬz3`u8Xy4w7o3Ȉf}4ԪёPv҆^VI5H^辝J[e_NV=ί]0FUY_0<|p>mҀ\XC^=VgH}Ű7"9}Psxcr'U"lª@1IawcI_˂O͕hq }kZ4`)^ D𗮅7E8[m!C߮/RmǸ\ZrV'G-Ï߼]?o|/l p$/e^O[һ{V+ *>/zgGgxn|\Yn,%l-ɗ~+C=! 
bz[ l#3Zmm1[˞8q3,9L6p'<&FթD.+힪f<?khe pt#-9yAyM~4~lbn͸6[Z;_4s,sؿyg=*[U-K^SWkbHȴf̰Y8\YrA ˥G;cՏbk`j plpݱukɁgR -B~!*vsB*hmαѭ[n>\]{vX0w?0Me!9tt,f1F*aJTɦq pRYGe"xFcmH}h,Uvv=8`rq3KWkvQX\b, d:RXXAq`c*ZQea垂<7Sj7,,'p"|[[}^E-w avJa?Gۖ(X 2ACJL-Skj5u11Dbb&z+/i /J wW ln<Օ*[Q|]; +؜DCsED h$% ("*kJZU$7dPn \2P-VZt&ڛVy 7guXmZ|n]O-{7:I\[;خZ葭6/gkIyqoEvڹfVZkM˕@)+GTK wَ?`'"f 5Mb uZ 4:TK栭i$ e`m, Yh6pMqsua#$˰{;]Jnd~tڼ+l1䆲^Q[K]'?N7- tM2/_-oOf y3e𝼙 .UaҔӨyd2?{ƕdO;h>@.3@d$ik,K )I~֥^-I!ڤXRoUSu;S64TGMbX]h|t:Z!Mb9ˌvez!0'{6d ,\fx6_Ћ<:M(:u2\(AQ.e[~&rlYW7o.ƽԣi#>m`t}jVLLo%L,c`),!cphe)} y= WX#ȫ fr.ip _}LOz|W,yVw=zޮJAxJfa>({Nأ J6mlΞvp^Kڌ2nی䷁:V Mu9 zjqQ)1i 1 R!~$Ikw$&g-4%S}zs˿G=r0}se.=~xRmt{t&Ct ѿ_P\4EN\1I>ŻY^vzwqQgt^ ./񻤞#ŹldF3L)皮k+ E|)8ld69F-i=ˎ6ogsUa Jbqs9suw[H|CrAW¼<9 }x7. }xHYFApIm4%Ji6r)hJ&v٭O쒮aC1Зgˉ5f=9dga.' :FyXvIT(ᐨAbm Ev /08Su FT j`?s:Q;vHFN!Cv+v Q$F͇Wֽk:Qݸc$FI7./vNv<,.[yS*$gUm>UL/ը Lýt"E{ ?|L"Vm)UT$(.Ţ}뺺6\_udtW<=1Oi"3`l8)4oyJ) *sUt!9<>qSZ}I+JjF0nJ%< s7jgnwj瓸2<GC->Qݠv0v(1\ћQa| )wjjD피OHrtzsv /^tc ? n y,V],\ZuBAmʀ]pZb |]q+j.N&ա+H5ԁIp؉^rf *П`uW{6\w;uW0UeE&Eotn1Ed^Qa>zu- }9SSBG$MԑGS0n4P):ܣ-8ln7r?$jGq*$cP"ZΘS9S5?X>y<B}4fgn) F~G˛Q~u\^MW=Xܜdv9]S/k+y{-c_:9cO9T -'! Z 79:wjF]#>lYFCA_Jp6y@r&㸼h;G97;ߙZk{ yGz3=` {ݝuf]L1}.nrÈz׏VUv}1U||y>h.s{[v~YOSr|UIbW]:M rexqǺ`]*$ -) X5*jHR5-g%!'B O}aKo\ɄUWkX[.OF4[P Y cL3f1/ܵ``S*II_:V( }qq3UOq9:"bS#>`i!z,Sm b~hFf7G?qqkISNE]XGAoM_Gmw3 Ol6{uYG{z9yD8u%*BTTa-Tm=S2Bl]Ykk|^њiViZ5MTҵM.@T_tM*'jBLI0Ӛf)FH`-w}Wu *@,ͳzԯ7b{!1"Y̋I濛Rp5~Q{cwkTظ%MG' w̓ҷ>b)P6mR֮k VAbJ+Uw=+\rVR/emPn6(Yި Jv dmP֔a]t:Hl, h9J>"YG:T1!E:@W )Eļ8yYrzDoXۥv$Mԁ: sI $d%ިQ̓SR`n'GaNtq0pF@_/ dQTHyF ;bB?yĻS6~o6BEC#@% ,@ntP /+:T#Aov-נ㓮*[[d2)S}Pפu"} }mjlZ\VێN*DJi{ * ͩN#oNS"`~R3`N9Bog]^osJ9\k?\]6e\[ŢTQ슪k"yl.2WtƶmTEJwcXRsע.䪽|̪Q|.]gaHw/!Ыiܱ2? HY@j!-l(\)=>({ɟ!5wA3G8jRQ4J>ʖ[0X Ϯm<ړ;q*7O 8}GXf(|W. 
r-Wq(kZʡ+䨘seKc\y h[wq\VNK=&CjGaK)g~yUnKekC[dYKgmr`"0R"#djBg&0%0jJĕ5./f\P@M{z}*nuK_bJ'mViwM#&KY9z#f-xbJPtYW:RSgMii·/E.yt Èڍ AmdF(FSU*leyV|}";VAu~עYqϣ*uJRh=s)!e\ }v`@i"J_$ҖCd榬>gj귏$#'xU*PTqP"^`W¬¬O)|P@Y-:o\zS^v~T@;U9miQ;:>XNey:5ίOg`KuمʮPe@NN?'__]]\O^v}Pg㫪ca# mJmUuTSm+а`SWI{WMMYR'6]mkǐ_Iٻ߶/3} }%m5K-I ~lS%dInJ;;3A%ޤ,^ꁡ!y>!}RCXJ4v+sjƄ~ٚ+ӵ"ٗO|-lM[ 27 DԊwEiG'k ,rG"ޫQ*>2|RAm ;ZaD eDJT6OxVR_Qcϕeڗ_ڛZ1j3vY%wȋW6wɮvZ챃] V-R֒]O ȷx10H}T@ߛOoVkP}gog8C?XڔSΐ,762aM :<٥RZJX~o%v藏=]#F7EE_Hpz2COѦ' 1Rf*d(` bLif kśi.#!TQ I8:n<1j!|:qr S[sVz}5ՃR燑ْ߿Uo<|V`Z` ~^^~܀vOx2Ώ zI~!Į=7325lz7VJ 眐H#5 >ϭƜ1yJƆrTF)c@XB0ZC0vd#QP ?h6l /45ڦUFڌT&}7rl\ӊ92ȎEdCVZdEvYa) ϘBgr>"NÉ2k!6ڭK;Cs.a~+!KgX8)*b:QS7Ic%Xo*46T@Kt\b݄Xs(.]J1_֧@6^VofKR 8q$@ۮޗ':6B4-z @Q ލS5n<1R* `kjP/_&1ՖϙZ,ʛV_ӽx fߡA%LH=obP|.$8ANlu`X"D)=]&&OQ:]KQ+Ծ|#q]d#)L2NtiO}X ?7-3hȨqmZ59,SEن ru.EsC5&fֱCQf' YKSZrF(Z90J=zD(z PA# DEAJBE#mof{v`mчdy!Q4ӵV4O Q.w|%X'2RnGd\C'c.mȘ31J>ݺ؄bQΡPf8'CA34XT ΅ Dah|Fz0aJW h.<jTiRkD!6BvN Sŀa$5I.B"ȡIA2՗STEgTLQnsjlTDjUf"^;,_Jeo9+.j^2uVY*,KݚH]H@JU+k/V)ۭR&U)uZ V)Iז hmܭzZ€./@#c | Fz^I)A懩(Nd $Ć HգoJ=j/uBvv'c1bAȀը#ЮQ)V#;rhlJs@5gvv ~A!S1`@i]v57.BvKl4|SCC%njͷu} }H|f~=؄R.Ԁ#jCSB<6s@w  L. 
2 #samO8 ;ot/B-2_3F||,ËG:ZMbA-%3{O{_ n֮R^&k)3cTCjI`I`Jꔣ"xw6aonGi8* 6]٫v?wyy\\L_wUpƵjtsO">4Ԓ9FվVMWq7,pw3[6~PNĀ޽  o56U!P5e"`gi}*Rou-#er[dJltsY4 /ǠN ul_&6 g{۱i3Yoł* 1UXW eb-En.( +Ō D5L)8Xe R>SS6M0I8qe:t +pQ2.\WoF,1~yܭ7og!WR+9 V|y `$\p`^}?*|Wb)M6%6Ϧs1ƓH \Y2) #42[ N*TuXA= SOI{V<`UT4 |Iy "rz_.K:ڇ u(^'r!WUo?}Ot<^\_1Z;~盻u ybA5׽OTi5,Z*ml{yMj9 <%|(GujO@=2Ci!q?.k?L?ej{8_!pwi__ʀ>q{ց=#1HF8RҐI#(a#HBefe2Au>;G¸#!͑K=H]èr\CLŤ(e}$T(m1_ !HOn!HDlYZ&`J(+2!̈́ߤF|RR#ZwH҈ZC7gB ЧR>4;y)Ԏ!nZr@ge5fjw ;ڑCvyV™6̓\K#2 +YBHęRAn_ej׆oBa'ԎhAeF̥ se,djЧCP;R6a8IE.nGjK/SLZM .2sjGjp]<m<Befg{0;CiSfv5 HY;& cK;O2̮wfGjD6ȳ2y2,Mfv/ّ:˧zPNT)s8hs.3暤{,X!1;I#hAaq>!.Sۡρ5<{)ԎnI[ݶ6,ܡ,3rS;GDhJh<6SLn>$"׳$ԳTY^,d(s='n\/%<"uXwnQ4MV5Urto:utўGGF+!EGA]+Rjs#}pĵ#3)Vނ9MRk }h>Do򞆗I2< ji 6 7 ^ϙu AB_oQiFsx\O` Pw킆 qrXA0(@lJwZYwÔhop|=.#/=) Ve, Y QB^sa G(AX}Dh5I cN!AzM ӏcq]ls=>sQ}x~kud:}ÆUq2 1S3߿q8X.CynCol>nkCy150إ}0ػ}d(wOs{^SPLiKj>ۈF"5dYS=&4qP K.l [θ}^Xȥ^ g> sn9[S)̋ڵ㛖VS?q5eg+w^䎗d!Zy6_~eq|>caa򚁯,C-APFWZָ%xIp­mcR*$NXZ 3}bܸ+>6ʹ[1V'.>_@ꦢ[r27"Vp.jjrU.SX-rB"VRaj EJ܄9gQrĚ#V|LT#(#5,C]4ŮjoYu'81ܩs\$ϝn$K.4xRNW2#4aYHޖ࣌ٓ?XꩳK={h(44ȞE1[2l{_AD+|hEa*Qe5eYI@[H>*Mr,"B,@h(bh|6)lJѠya+*E$qE): `z,JNjKCD!JE!OSTۧ6eXwrFUP(Ya O)EΣ%A`*с-c Aw ,tV18S'?v}qkc lԆ&VV~ +'S,zY_#^H~}uƝ³8 P_Ňa|phVT㓿#FTu>-eQ3Les]ax,w.T9\ nյE]bHQQ쌜͢M G:!' YXK-LAŕPŕڡ-iiq0ԥhݜGJy09%ŎJf\4uEѝR<lxk@nK;f74=iY7WS< ~J(hWUU! tY:#Vnǥb_fX]rC hղuCO]٪ű?ҌŨ{p#ݭTJ.fkޤGƣ/ue"20<'/ ,w-U|OWl:4 VwԱoW g-={uoQg~Q sy'ҞD"5"g'_-wd܉ls<ѹ>XS-|mU7N7jdO"X{krTL=am,<"s4:r`厷Qehl؍vrrIVgw-z2LyV<7K.APZ9ARB)U&j-q 1a?ϳKCZh4b RBkݱt^g9uc]$G |LMwN%XP P^JoiR$%hqhߜҺQw/Ou=iI,1\w :+mcJd1ՔT9Ї^w(p>iWG28SK4G zŲ%3.T@2.L~rݞxzċyB&LO蚋W}'N6{Կ{Xs* ΢&GԹb4/)wDh}@x~*txCX|SP(_Vo\]ĠONjuq$N|6+CGiڃ.ˆ^Rl H͚ޥBptQ1#>Bl[MSE? 
K~ͥgP4^ Wt*Ll7 9FZy%V\^#+ZV*y*Y`T/NIe,u&׼j0π#@?E)*}(F6*i xYDRNyt)DU*kD: CSR[MR"7|)YRRLp+%lMSeO Eb6'9Km D#bNYL6r k0i7B؟ m7'/0S̿ K2hLi`+GXiH<8c4ōNï|s~-W[JX=U {&5Z) 4b)m\r>d}ԉ&D; tGsH s4%4:a_B<׈z!PCvF,- R9Tvw@ɝ֝k/:l۰ZM7뻛#Sg*ڊo2)qSQeqtmL ]>]ȧjyVq>~IkKc`<=Lb}6V͓bK;AH?9>d\֌w^5x &i(=C6 f2wY]L,E"g4 r#ssR\IhI9BsJy&mH.uwbdcwsD% s> v̙#NRq\,eHJ nDvmfPY02| )ylrvh9ǠJUS#8AڣBW 6r"9;/o/dvr +62! n{?bD^8>eiKy`4*y74>7O<_32"j" lR?ǕQKW(i_M/&b8=Dk }\Ibufcο g?D1WUW4sUohZϯ{\6ߦ?j%h}5GU [pipW_Ш@dN-[ja[B" {˰wǏwo /Ɠk ʃYAGGGѠ'aoo>-J_p=|BH>KGnK["kU "9&ocR<$dQ1o)2yj]m| |)1d^ ȫK(`Iڡ 'OWu8pv`hfV%ӣqYk@S`@< @g(o淋n:E' ?;wv~ _ho{C܇4XvDpls!`Lh@B+aZgLX ]@3@R %$K:51 ^L咓) -C#XwYBnDg_CBuI{gomC8C%?KvShiAnS>U5CWh$F67=VPBv܍_跆e%EE^QQ-*JBЬh~`WA4AmʂsU'EGKwڤt#Rb%'hT-7 "{8\>r!yitس:>×PgG@çU0z>/qTk^F|UV6|3m+ehtaj x֐QvRu3 ?c|p`gEDB{P4xWD ;X έMi`/;wpi< Gуm\u[}c֌t@fMb(:9Y]k%+y7_$6ܤ NH z#qR%h`(n6iQHev&z6ff=cY?ͫ_Drr)V"6Ѩ>_Qw%"ʓw[%lm _ /gݣ"1 wzY^4ېs߯{?랉VY#_{؋vh}M -NaфE=JOW'0ȝiC8 +Vb1{0qĮK +dwB䅘1gB߿mQ)ӻoUُqsQQn$>F̝FL>^ _0X75 ^|}P()y+`gzNFE6$2L&ʶD \rk,,Ê%jgY% q̖Av\,zvY{{㡸|?$'|N)4oԚs!1'D -0&*u,injYAM q$7\˿ٹsΪvTy&W=m"oěۄŶ \5l "!zn~gVv:{vU(QM40k'n>iuMm 1Zc{*EmvZW.Iֻ5 B }([AXLj{q 5Rf <}AN2GԾuP=7;< F2Im“l"U&<&V7TGw{XqɃ^Gr5{fMx]A8rv8'u})_+ww"mT+1G„'kb Z1,W/fs*fH"sB2<LP.N'$(+#Xܡj"CG|9z_޾__ $fR#5zh<&wVxsK8TG~ܢ # EӜupn4 A a ľxy}G?/jefDwZe72#AP*uѥ"w9CRr閳W^;x΀ vr;ퟦ6;w{ռ_;$:]ڃ"|bݶSkK<\*.iی>n@vTKѡʣTk Ô_ۣBc=&zL@8ӽ9Eqqߌpb~mWuYCBz*/;N`lT"X AsNMiO.ʰe!:fN6^^}Y FVEh vϿ]_za]/H9ZY"HeY]FEJyjJĈ6E% maEȝbìu|Yvo㫟^Yt~~=x08t߾7g ͌#ʼnMXMHXVRi4d$aRUrTäv q)YTNM2mZd[ '*('sd ~(BVď$/`@U2_52*nF!3Ð2PUr8TZgEȭE.IҨ1zbPJ''Ӛ0j0t{^FEEUy8)Ff}HVǭ5a6;1jp, W4,%z&*-2D #KNkO!:RKmwQξL|FRFA8) $H x%!fO6=lȰ` xRᩧJrGJؚJ#")fjh̟ ,+|)Xp KPI[iLqׇducN:`dӇrlߜ7uO8zfR %tEXBa ]ԗPm.5a,ag -â 0ĎQm$ˁZ/Lm盶KNdm糷f 10Vps}VL^>̮wRoAu1BFUd9d>-X\Jg*$.TlI4|zat\1ihMVˠ!MSA-WR%qRƣ R4#EXiV:)^ HM[3 "&] H%XzV4pL͚Ih%z)>T׉w/ ԖYc&UdcH&ʤ…b%_є0{4u桤`vAc+U@ 2Ѥ2h2(DO*Fus4r]4:%-9bT?.^^a/)n9V$\d:|[D~73쫇\1#{je&z>\hVB)9bk?4 Or) MUYP,_;ÏogH$Ø$V㳻C 
HzԸPDH@I5&}$AX(?2kfW7K[ncgZŮQ>-q].`5WS>+Hk9WW|=g*ׄt:hHښ+rWf0ъqHJc.2ؿZC b @RSjU)='piI-?h.0JP큣sbiiK`,`Z]e|#(6=iJ6kx6׷UOK=3`fI5@}b[6ѨBW)(HqDP4AZT jwWڱӣKy}$*ߞjsKqH1멅qn!4(t{컝J9_By6a87!_HdAAT)Ua!<@n K#1/ŔU &hʰ/6xסsJ4; \'Ju=wLkZ2GLIRZ}:[\XFxK )~?O)H*/B9dQD*)5Db hSjXBj}֊PYwY]XI=bSf͋:+R g.έ ]X%_~gw{]s2B>Ϟjm la/w^N2xLW˚nev+=2}ի|gy*ǥ_DVE=!hh ZJڭ~=vB˃#EMg 6v&4VoYR-Wrywh)ڂ ݲ n} "[~Qv( {/<L)lĶrƜ1[3F--(ô3t2mQlnHl;ILlbpkb kwIzxĶii]|9xb[wض~zt)6OĶx4S'2ƓPm B U rs""  Lt`!a&鴷J{y{ zJ琢Chӗ!^N:qb[J~N[k:q ¶ FX*l67ahJ8 gsXmhUAc ͵q&KT(2)]wHL45 >kw̬;\k- Xga"Dg8iA=CtS<S^*IK/^zؠXp}ښ6e QbZ{Z%U1(dܘjku aaOS 7"2ͧtHj]@ԭ:%r2k'S]¸U-+̓N9"4=17}z / VCvSuNz!P*5fQ@@^7' m]|ˈr P*OnSlj&y͐˦f"kT/t ~tGm +R5y^8B\ Yyّ˒T;݀âeX [ZfJ[X-\p=npqxH[JgڲOW|q @CkcgH/տ4Ow EDyݘGs_/fbڨ7҅7Ϋ;wrBBwn7\ <&47PaRB&Or~IMcxd$9eb{F$`KSrVp<Dr]+{ciSŘVc=]qiQlny)`rz.c1߮9\6y6҈bp8ERLGSr4f@44t8A,)*nJ*qF=<^U4iI<ҬxiŹiE=|R5^ ct ӣKy}0N(Q#J-ӣkiL Q-٬eVOCU#"uӘPWiQc2GRw2 0bɤeCs9Fs3277E&5h:5F2%ӪTQO<į['@S5Lԇ(+̿#\[VOZXO/ wyF\LA90!@ι2Zx-a!Pɯ9`Au jwCr`bfMJ~LzPZ˫{@X$Uh(Dو+-yj1p,;$BǪ JD0A )4aBO=2]LjNRPڂ_C`q [-K 疎!QRGA6<ԍa&kUo6Ezdu.(Rr 2 7[Eu  X SE)sN)`c/xwD(DXz F%\r}GJ,_)yR#k oYZI}DPNP :+,ԉs%q%fXH$CRvk )qH 1/IS-r4@ 4XĀPYhBZ*$䅋hLIyк)ɺ`byPDtb(cZ yҔ4fݢ -jݪ.Q2\ʺEZ?kJ&gQIZ{zR)feH%1K S&(y5\^oO5fLM% 8* 3VsqG0p['^tn%dSN]qעփӗĺ׿C_E4F_ImvA[cמwoYhB w@B^ȔĨuHucg6"(3b1jpg6;H =eJBaEdqb|,c1bIݏ_ Gz$0Ǥœ+srYyZU"dC $*0ڏuE}Y'S000].0Őh.dȗJDVhVc*>$}ʟ DA:v`rD\niCJSkւUAɵUBmP%$1 z`id++Q$训CURGs jP$\5螓SK[e#V]C.57g(HD0E:)[Cw[$!RQ~2 "ҪvIZk+~ҡD9Iu$sGƸ+xJ/s `[-ĖG1:#[=6gb#B>+t]p»d8nlfec7FXaB8s0D_KA$`{w (E@{C0-VC$BӰ\ &GbL2dry;C^1}* :󭮄TD\n=%4%-Xc0긒$q${đ'Zc)NhMewNeIa1Yd#Gi(4xB,;ud"Qvž wׄly댽B/W|rgݮ ɺk s7A-bcDLGS~%%4{%h7+?}ϣX^;7J z ## 3c0upv" #FSP< Xbjy6\XVMAbߦc֪^OZե&'AO xsn,V->b`J#,jJj(ܨ9,Sijs4G>ȯ=P>x8L/y +{!F4uqđ:L`!%H)^:_΃8C{ ?]rD3~JY$6EYSuu- >YJ+R3ՌTaj޸.oo"Xz;N^M,5b|"1&TL Dx؁Hic"NA%\RK0)p,5y(ko8frn=9`p,]o}} p(QxJ#iI\oG{,\2Bpʱ_ H\M|6d?)GAIXaqa| Kv1x ʄc[}}=}B&; u"T1˛Bj]S 6_lO6ta9IͻmKѓI:LZ;5.UPa"02DQmѧ0ݱ@XN&,;P=Q>P j< cCUsurTJ_BngiPK!0 (#QH};f9a4S;BJ%Uu2z%ςfmaFݱKyr5!iÚ"d9#?B) fp3 
TX1ILl2ChXE}cHOk ]v8{5[|)n_ywPsN˹U*+Q?fN+> ާì9,nm *ֈ>Y$DHe&t9b%ԅzcL>=Щq@%Du]wNBҨ9̭TT j e{sߛzBlѺ9|V7}7>U&1mZ?_+Wg!IYM58¨&ʬxVajAd8jG; YlnwD{oXy@~ 0HZ;2~Gٹj7-gkj)8PzLP/`h^b뤠{RVmfzw:w#"#KQ !֞^įTSV8#HJY?MMr”?,&+) jq`-u'w_vW|*' 4 Ƃ1|/ X˹I(?\V|@#2RtD^ݞtuP|2T5_m~ÿϺ}Vf4GXi06LFxꡇ?Nc?7|f 4~p-4u{23{=2>MEPS^_߾[,GLܵ ˹KGۂПF|~,^_;{[.;7V Hݚ$Ờv BGpm g\^XIO( M(O)*\^ĐPёbdCB1Iz]-zpˠ [7&^qaIAZ?'tD)ω#$XQJ^eX"(7[ܘ닌|[Lr |Lƀk֙27-o3rݛ)bξ5+(E1}JjNQCtHRJ-֞ a˼Omꥧ)NqFishmnIK8C2uXpn$9n;G 4"~]o@տ~Uj*~!TGn*o ٪έ 晙];eV+pK\+oe1 PˣL=6D'tpT-.Tg4sVͰ1٘.ĵS8 DN`9 p 0*)fK۫5f@ *znm7`n x _6? ~Y\‡kzQ/*N\>^㒎`Do\X9uuJTU2|~[!?AU6ȏo"?W]d*_ay_MmK7;LX%`~*M&ܑw ,y5FX,:fTrwvN燁ʏ\4A Kr$HT$ LN'( J`Da u.G*Q֌8Bd((hVaόYнySũЄ$2U R{NPuK}FB'cA[gE;8E~-]/Q;ymbu-߃wJ栮ԥaX?ywcEVaw-{{dc-GZ*׮䵰{K.Ǜ2D_/lq3QHaVsߝY|50}jR5k0ە[q5Fk膯5W=h\s#ooM돏b~|s} OFAsSUΰE+Lo3Mrkܕ(b'6x7xz^ۡCjbDS%bu8$,3֭+6C3 hv +nsG?Gd%,hI dLh̝1V=keO1vσj T;E.Ɨ<<Z(T ajԿTC$c XIFbeP>&ꬳPђ}MM4AsCv;(]딝;,u7r)V2TęN2H)NUw(|F,}t Wm<}I>:SL>/] 3nı݆yhti<|خΨ͟^TAu'Ga1Gl]qvjaCߔl_ݘHmh m4~1Om&Q]+N8̀Z٠*Pz(9joZ j* ) /G$TUX'I"(Fr)9CV#`$u0c )aAru>r_qo]Ogi`2 XrO,u1i%+Kt o!qdLRά^&& {'ls,ruHz4uq.3(CùFO5U VAVDéKAN-1j 󉢚#gIDY,|1;H-RcR|JjϟN}@cYe-c}@ Y@?k#HJ>2 @5.הv5E"$isu璈j\\)~f$][=b0FթZM赺muPRq*M]k=k R=1Ο5̓`Ͷ0Y] |>2 GiclIvGwp;s00W`&),4};ND$.br 1'80~k/6{;[VͲle6^[2O+F'ٻ6+WPkk+o S5U-]}_t`շFkFTL&)\jϸp;AS͠UY,OwSuд<&$pmEwǤ(t0%`@HKcTHćlwpJNs))6hdfE [.܀ԫ*d->v2e\r{Š0Mg*ɀWQ\]hܾ #ZSI k1_ZHzU[7R=xVKa W_U 7_Gnbu4Wlat\  {Sl!zκ5ƯTVray?¡04"Pe;x`qU?:JjVUɵb8q@!⽊#{UҚi擻ypR yyh~>nUOj६"y[ >ZՋ{-=ǚ`7/%gO?m9^'m19<ixG`l~tW/!/GS" DSlBS쉦@TiDc4K_V=_鼪J\Sk+Cڏn~tt*^ 9Ji7?ZzyYH"JwY~VYDrHV?ItW-o9hf Js bR2(ΕΜrM H4j,ѣ1˭>IC1YuY%?*!LDVs{G[ش $.EpO PJe+C [gB 4m`x4 4ubeM^c?rkAa⭁QlniYߏӴ~.J) tDN7͂{wxi7]Oa(G!gDᔦ:RVV0MÅ~cW]씖g&}?EH|gЌ꽉hƲN`'RG7w2I_]8&,mt\Ny\lweHZ:tly*d5ⴁlnYz2"8e8&4w>yB֕UyIyFŅ$j]uv;tzխaZ,Oo-iy*^Fløen[MmwϓjӖAwӏv(C3iCga^؉ΪC]PڧTOa.{EجVbM=` sZZ՚.rzmuVx~^Stzf ]tُt#^rrIE۬mbE}+c7>9O7;PiWWQ"oj^M<)r_gE :9-{Y{q_BA紴:]_[?D~DЩ9IN/P 鐶o5B=&ly!r܅_RT ~zM* .;:lНo/w{`) Gu]׭#YtѭIzש̭glv=mEj3n/֊W<>VђhNMUR'*FMݭ]m9i-h?~Vђ 
UCy cT-xb߀F4A6zwskf&Oӛ1 w.+ Mأ;?Dw.Up֖0ۋl jfL ׁ2#rKA%L$.:?ۼ)σ/k!>q˗9AZ6r2xdƓǐ B4*N!7#;rv^+%UFe>iT΃R]ΌD8SAe:۞Ѫ\Šni~sBs({@hE\rVI+ \[*󇛏jd0}Yb,m3"]Ƹ-sZ6(jy@b 6egĨ(k# "}*7ba9 dc4QH2eY9"2M& ܦotLYf1d"u$i*#O(BDB@{a仴G}Ĉk 8O:.Xh/GHw:,HW`̗ ,p!JA; &I8X'㘥`)%%#p]#ZS G S.ri$FԒPBVb!GTYv:x $V1p SLK-;JrYe jPS/8F=E7E@Lb(p2S#cOz raќ5bLHa^ e1 8L8I#HcNr-e\߹Wj8de0"{"P\`"ybvx6s 8Zw@[zJVIV VH&z3[:M(VYzVբu~۝J|TTJgX…|Z NK50u0o]!-.;>B:!d hf*N,[UWnrY7,nBj*[FJa긕'[P*& +fm; +OYdA|u߽ygqU+wVw/7,nEgZWn0߯ﺏO: _ݺ^9ߙ!J˵i|.H])OKToLukK/z (hw[GJ_רVTc+I X5>]V!$U FZGoQڿֶCseML\\t ,uԅtl8mՌyMǛgA@S.12)s-+3T" !-@19j_ۏ ka{5 wZ]Yo#+^FZއv`$3#{ _8%%u{f_;b_b+[7f@t>g?=!Ff[hlFװ{_YL>M=_}]. Pc1,M;6c6;7Е7pmvq|hl8|Nj33@%J7zv­ʣm V l)MeSbpa^SJä{y"nu]aVmlݥRlغؼ{ wrAK'g48xzsf37Ngh8GKG 6CxwYV)A>Ѫ 9ϥzz~Rh @ogHWD|iP_ǎ'Q͑uwE>Tb\CzJFT ~ȻS9Ӕ1F:%4eJi#"Y$@nqCɸbTEX:KLkV40n*@],|_EM=;]k,57PNDCX}8fD J 1'Zbk =`뤄nok5zr| qۻҵwA܄;al+F3FUH, :*ӜJnpB.mW^XXJ%rzqQ;Q\IԴtsEBqL1J ֌ `J[]G[HAhGw_W;gh?+ɻnx]C[ixSس]vrPc1W:%&tx8ix<"!f,$*rdU%4M텃@BÅ?+$wG'o?%(Q|ejm ƀ.l ƀ.1MvFBFq$8B$ӄp" rD9=(|{Ik[>&A R&"NJFL, f1bi8m!4 C(!/HQQ$!Ŝï"M$"1771";H < q̅QºIN}7 5T$G1tM<;1bpE)`c*3dLFm8C⑦P"-$Q!iG0 GŜ++Pf0DbM bJ%*e^5QG"C:IcYN˧ /ҧ\CC3AFդzRy@1VUKE:jbh؍g!LmM37 0l`Cޔ'1T!.ߙkn7/̅@+tw,]#>B%}Zno7mpխPz{JKJlUޙmxw)Vw}/J"#+F,]FJl^bA^K.pCR༽潆$8> pABZ/sѫs!8a=Q>*BHIIl"Əml?/KK%6Mqk^+5>'WiQ-=م ;Vc;qy_[d{Z&:2:5{W~~Ƒ E*<WhOWYo'Mv4䅫hN 5zȺ)T*HS HtЦ t{0; =ibf37ﵢћ^3Rzl@ RI4ܲ-E;v {\ZА!:eK};nzNN;|ۀ ȔhАA:qY7u EuBc݆EHeo-ukCC^c!$ 5t5?k.K܏I=Avar/home/core/zuul-output/logs/kubelet.log0000644000000000000000002042261015157635444017710 0ustar rootrootMar 22 00:08:48 crc systemd[1]: Starting Kubernetes Kubelet... Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
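[Editor's note] The deprecation warnings above all point at the same remedy: move the affected command-line flags into the KubeletConfiguration file named by --config (in this log, /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent stanzas follows — field names are taken from the upstream KubeletConfiguration v1beta1 schema, and every value shown is an illustrative placeholder, not what this cluster actually uses:

```yaml
# Hypothetical KubeletConfiguration fragment; values are placeholders.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
registerWithTaints:                                         # replaces --register-with-taints
- key: node-role.kubernetes.io/master
  effect: NoSchedule
systemReserved:                                             # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                                               # --minimum-container-ttl-duration is gone;
  memory.available: 100Mi                                   # use evictionHard / evictionSoft tuning instead
```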
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.417222 5116 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421411 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421439 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421443 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421447 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421451 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421454 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421460 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421463 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421467 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421470 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421474 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421478 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421482 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421486 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421489 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421494 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421501 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421505 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421509 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421512 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421517 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421522 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421527 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421533 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421536 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421539 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421543 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421546 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421549 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421552 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421556 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421559 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421562 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421565 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421569 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421572 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421575 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421578 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421581 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421585 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421588 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421592 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421596 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421600 5116 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421604 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421607 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421610 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421615 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421618 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421621 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421624 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421629 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421634 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421637 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421640 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421644 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421647 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421652 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421655 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421660 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421663 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421667 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421671 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421675 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421678 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421681 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421684 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421687 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421690 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421693 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421697 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421700 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421705 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421709 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421713 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421718 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421721 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421724 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421728 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421731 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421734 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421737 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421740 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421744 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421747 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421750 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422296 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422306 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422310 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422315 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422319 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422323 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422327 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422331 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422335 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422339 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422342 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422347 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422351 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422355 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422359 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422364 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422367 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422371 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422375 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422379 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422385 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422389 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422393 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422397 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422401 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422404 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422408 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422411 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422414 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422417 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422421 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422424 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422427 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422430 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422433 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422436 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422440 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422444 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422448 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422452 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422456 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422461 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422465 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422469 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422474 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422477 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422481 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422486 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422490 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422493 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422497 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422500 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422504 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422508 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422511 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422515 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422519 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422522 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422525 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422529 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422533 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422537 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422541 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422559 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422565 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422569 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422573 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422578 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422582 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422586 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422590 5116 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422593 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422599 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422604 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422609 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422613 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422617 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422624 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422629 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422634 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422639 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422643 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422647 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422651 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422655 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422660 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.423969 5116 flags.go:64] FLAG: --address="0.0.0.0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.423984 5116 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.423997 5116 flags.go:64] FLAG: --anonymous-auth="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424003 5116 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424008 5116 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424012 5116 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424018 5116 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424023 5116 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424027 5116 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424031 5116 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424035 5116 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424040 5116 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424043 5116 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424047 5116 flags.go:64] FLAG: --cgroup-root=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424051 5116 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424055 5116 flags.go:64] FLAG: --client-ca-file=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424059 5116 flags.go:64] FLAG: --cloud-config=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424062 5116 flags.go:64] FLAG: --cloud-provider=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424066 5116 flags.go:64] FLAG: --cluster-dns="[]"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424072 5116 flags.go:64] FLAG: --cluster-domain=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424075 5116 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424079 5116 flags.go:64] FLAG: --config-dir=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424082 5116 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424086 5116 flags.go:64] FLAG: --container-log-max-files="5"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424091 5116 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424095 5116 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424099 5116 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424103 5116 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424107 5116 flags.go:64] FLAG: --contention-profiling="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424111 5116 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424114 5116 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424119 5116 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424122 5116 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424133 5116 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424138 5116 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424142 5116 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424146 5116 flags.go:64] FLAG: --enable-load-reader="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424150 5116 flags.go:64] FLAG: --enable-server="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424154 5116 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424159 5116 flags.go:64] FLAG: --event-burst="100"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424164 5116 flags.go:64] FLAG: --event-qps="50"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424193 5116 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424199 5116 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424204 5116 flags.go:64] FLAG: --eviction-hard=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424210 5116 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424213 5116 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424217 5116 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424222 5116 flags.go:64] FLAG: --eviction-soft=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424225 5116 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424230 5116 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424235 5116 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424242 5116 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424251 5116 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424255 5116 flags.go:64] FLAG: --fail-swap-on="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424260 5116 flags.go:64] FLAG: --feature-gates=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424266 5116 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424271 5116 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424276 5116 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424281 5116 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424286 5116 flags.go:64] FLAG: --healthz-port="10248"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424290 5116 flags.go:64] FLAG: --help="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424298 5116 flags.go:64] FLAG: --hostname-override=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424303 5116 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424307 5116 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424312 5116 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424316 5116 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424326 5116 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424331 5116 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322
00:08:49.424335 5116 flags.go:64] FLAG: --image-service-endpoint="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424340 5116 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424344 5116 flags.go:64] FLAG: --kube-api-burst="100" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424348 5116 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424353 5116 flags.go:64] FLAG: --kube-api-qps="50" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424358 5116 flags.go:64] FLAG: --kube-reserved="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424363 5116 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424367 5116 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424372 5116 flags.go:64] FLAG: --kubelet-cgroups="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424377 5116 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424382 5116 flags.go:64] FLAG: --lock-file="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424386 5116 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424390 5116 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424395 5116 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424402 5116 flags.go:64] FLAG: --log-json-split-stream="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424406 5116 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424410 5116 flags.go:64] FLAG: --log-text-split-stream="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 
00:08:49.424414 5116 flags.go:64] FLAG: --logging-format="text" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424418 5116 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424423 5116 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424427 5116 flags.go:64] FLAG: --manifest-url="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424431 5116 flags.go:64] FLAG: --manifest-url-header="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424438 5116 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424442 5116 flags.go:64] FLAG: --max-open-files="1000000" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424448 5116 flags.go:64] FLAG: --max-pods="110" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424456 5116 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424461 5116 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424465 5116 flags.go:64] FLAG: --memory-manager-policy="None" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424470 5116 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424475 5116 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424481 5116 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424488 5116 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhel" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424501 5116 flags.go:64] FLAG: --node-status-max-images="50" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 
00:08:49.424505 5116 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424508 5116 flags.go:64] FLAG: --oom-score-adj="-999" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424512 5116 flags.go:64] FLAG: --pod-cidr="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424516 5116 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc2b30e70040205c2536d01ae5c850be1ed2d775cf13249e50328e5085777977" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424525 5116 flags.go:64] FLAG: --pod-manifest-path="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424529 5116 flags.go:64] FLAG: --pod-max-pids="-1" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424532 5116 flags.go:64] FLAG: --pods-per-core="0" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424536 5116 flags.go:64] FLAG: --port="10250" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424540 5116 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424543 5116 flags.go:64] FLAG: --provider-id="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424547 5116 flags.go:64] FLAG: --qos-reserved="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424551 5116 flags.go:64] FLAG: --read-only-port="10255" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424555 5116 flags.go:64] FLAG: --register-node="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424559 5116 flags.go:64] FLAG: --register-schedulable="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424562 5116 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424569 5116 flags.go:64] FLAG: --registry-burst="10" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424573 5116 flags.go:64] FLAG: --registry-qps="5" Mar 22 
00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424576 5116 flags.go:64] FLAG: --reserved-cpus="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424580 5116 flags.go:64] FLAG: --reserved-memory="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424584 5116 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424588 5116 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424591 5116 flags.go:64] FLAG: --rotate-certificates="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424595 5116 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424598 5116 flags.go:64] FLAG: --runonce="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424602 5116 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424608 5116 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424612 5116 flags.go:64] FLAG: --seccomp-default="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424615 5116 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424619 5116 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424626 5116 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424630 5116 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424634 5116 flags.go:64] FLAG: --storage-driver-password="root" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424638 5116 flags.go:64] FLAG: --storage-driver-secure="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424642 5116 flags.go:64] FLAG: 
--storage-driver-table="stats" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424646 5116 flags.go:64] FLAG: --storage-driver-user="root" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424650 5116 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424654 5116 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424657 5116 flags.go:64] FLAG: --system-cgroups="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424661 5116 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424667 5116 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424671 5116 flags.go:64] FLAG: --tls-cert-file="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424674 5116 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424681 5116 flags.go:64] FLAG: --tls-min-version="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424684 5116 flags.go:64] FLAG: --tls-private-key-file="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424688 5116 flags.go:64] FLAG: --topology-manager-policy="none" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424691 5116 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424695 5116 flags.go:64] FLAG: --topology-manager-scope="container" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424699 5116 flags.go:64] FLAG: --v="2" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424705 5116 flags.go:64] FLAG: --version="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424710 5116 flags.go:64] FLAG: --vmodule="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424715 5116 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424719 5116 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424813 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424817 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424821 5116 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424824 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424828 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424833 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424837 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424840 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424843 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424847 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424851 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424854 5116 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424858 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 22 00:08:49 
crc kubenswrapper[5116]: W0322 00:08:49.424861 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424865 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424868 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424871 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424874 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424877 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424881 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424884 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424887 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424890 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424893 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424896 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424900 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424903 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424906 5116 feature_gate.go:328] 
unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424910 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424914 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424917 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424920 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424923 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424926 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424930 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424933 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424936 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424941 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424944 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424947 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424950 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424954 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity 
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424957 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424960 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424963 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424968 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424971 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424974 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424977 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424980 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424983 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424986 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424990 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424993 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424996 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425000 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425003 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425006 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425010 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425013 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425016 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425020 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425024 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425028 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425032 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425035 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425038 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425041 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425045 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425050 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425053 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425056 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425060 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425063 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425066 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425069 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425072 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425076 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425080 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425083 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425086 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425090 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425095 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425098 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425101 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425105 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.426593 5116 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.437059 5116 server.go:530] "Kubelet version" kubeletVersion="v1.33.5"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.437443 5116 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437501 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437508 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437512 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437516 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437520 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437523 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437528 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437534 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437538 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437541 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437545 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437549 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437552 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437556 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437559 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437562 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437565 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437569 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437572 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437575 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437578 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437584 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437588 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437592 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437595 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437598 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437602 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437606 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437610 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437614 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437617 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437621 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437626 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437630 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437634 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437638 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437642 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437646 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437649 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437654 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437658 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437662 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437665 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437668 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437672 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437675 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437678 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437681 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437684 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437688 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437691 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437694 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437697 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437701 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437705 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437708 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437712 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437715 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437719 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437722 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437725 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 22 00:08:49 crc kubenswrapper[5116]: 
W0322 00:08:49.437729 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437732 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437735 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437738 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437741 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437746 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437750 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437753 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437756 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437759 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437763 5116 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437767 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437770 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437773 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 
00:08:49.437776 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437780 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437783 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437786 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437790 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437793 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437796 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437799 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437802 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437805 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437809 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.437816 5116 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437918 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437924 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437928 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437931 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437935 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437938 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437942 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437945 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437948 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437952 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437957 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437960 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437964 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437967 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437970 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437973 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437977 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437981 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437984 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437987 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437990 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437994 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437997 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438000 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438004 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438007 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438010 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438013 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438016 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438019 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438023 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438026 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438030 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438033 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438036 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438040 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438043 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438046 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438049 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438053 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438056 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438060 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438063 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438066 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438070 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438073 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438076 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438079 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438083 5116 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438086 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438089 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438092 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438096 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438099 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438102 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438105 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438108 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438112 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438116 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438121 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438125 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438129 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438132 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438136 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438139 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438142 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438146 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438150 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438153 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438156 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438159 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438177 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438181 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438185 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438189 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438194 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438198 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438202 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438206 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438210 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438215 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438219 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438223 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438228 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438232 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438236 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.438243 5116 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.439022 5116 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.442845 5116 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2025-12-03 08:27:53 +0000 UTC" logger="UnhandledError"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.445936 5116 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.446032 5116 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.446982 5116 server.go:1019] "Starting client certificate rotation"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.447126 5116 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.447225 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.471825 5116 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.474959 5116 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.475368 5116 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.488138 5116 log.go:25] "Validated CRI v1 runtime API"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.535685 5116 log.go:25] "Validated CRI v1 image API"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.539447 5116 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.542995 5116 fs.go:135] Filesystem UUIDs: map[19e76f87-96b8-4794-9744-0b33dca22d5b:/dev/vda3 2026-03-22-00-02-24-00:/dev/sr0 5eb7c122-420e-4494-80ec-41664070d7b6:/dev/vda4 7B77-95E7:/dev/vda2]
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.543042 5116 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:44 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:45 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.557471 5116 manager.go:217] Machine: {Timestamp:2026-03-22 00:08:49.555508433 +0000 UTC m=+0.577809826 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33649926144 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:80bc4fba336e4ca1bc9d28a8be52a356 SystemUUID:6f1c9f09-bd93-4412-afb3-903004a8bcf7 BootID:4e17d39b-4bf4-4f5d-b01b-aaffc38eb890 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16824963072 Type:vfs Inodes:4107657 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6729986048 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6545408 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16824963072 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:44 Capacity:3364990976 Type:vfs Inodes:821531 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:45 Capacity:1073741824 Type:vfs Inodes:4107657 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:12:96:9a Speed:0 Mtu:1500} {Name:br-int MacAddress:b2:a9:9f:57:07:84 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:12:96:9a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7b:9f:99 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:bb:8d:58 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4c:e6:53 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c1:20:11 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a6:05:59:a8:19:71 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:f7:76:37:16:06 Speed:0 Mtu:1500} {Name:tap0 MacAddress:5a:94:ef:e4:0c:ee Speed:10 Mtu:1500}] Topology:[{Id:0 Memory:33649926144 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.557711 5116 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.558044 5116 manager.go:233] Version: {KernelVersion:5.14.0-570.57.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20251021-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560238 5116 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560280 5116 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560520 5116 topology_manager.go:138] "Creating topology manager with none policy"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560536 5116 container_manager_linux.go:306] "Creating device plugin manager"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560561 5116 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.562648 5116 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.562981 5116 state_mem.go:36] "Initialized new in-memory state store"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.563183 5116 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567522 5116 kubelet.go:491] "Attempting to sync node with API server"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567631 5116 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567661 5116 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567682 5116 kubelet.go:397] "Adding apiserver pod source"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567698 5116 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.573620 5116 state_checkpoint.go:81] "State checkpoint: restored pod resource state from checkpoint"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.573663 5116 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.577827 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.577823 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.578705 5116 state_checkpoint.go:81] "State checkpoint: restored pod resource state from checkpoint"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.578725 5116 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.587409 5116 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.5-3.rhaos4.20.gitd0ea985.el9" apiVersion="v1"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.587865 5116 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-server-current.pem"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.588687 5116 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589615 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589636 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589644 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589650 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589657 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589664 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589671 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589678 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589687 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589700 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589710 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.590493 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.590543 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.590553 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.594127 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612285 5116 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612864 5116 server.go:1295] "Started kubelet"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612850 5116 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612921 5116 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.613452 5116 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.613838 5116 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 22 00:08:49 crc systemd[1]: Started Kubernetes Kubelet.
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.616046 5116 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.616490 5116 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.617577 5116 server.go:317] "Adding debug handlers to kubelet server" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.618317 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.619199 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.619340 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.619542 5116 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.619155 5116 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189f013aa607e028 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,LastTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624214 5116 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624278 5116 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624326 5116 factory.go:153] Registering CRI-O factory Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624361 5116 factory.go:223] Registration of the crio container factory successfully Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624441 5116 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624452 5116 factory.go:55] Registering systemd factory Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624458 5116 factory.go:223] Registration of the systemd container factory successfully Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624475 5116 factory.go:103] Registering Raw factory Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624486 5116 manager.go:1196] Started watching for new ooms in manager Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.625469 5116 manager.go:319] Starting recovery of all containers Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.648737 5116 manager.go:324] Recovery completed Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.669732 5116 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.671628 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.671697 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.671710 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.672440 5116 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.672462 5116 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.672487 5116 state_mem.go:36] "Initialized new in-memory state store" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.680047 5116 policy_none.go:49] "None policy: Start" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.680547 5116 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.680566 5116 state_mem.go:35] "Initializing new in-memory state store" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.694252 5116 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696083 5116 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696130 5116 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696157 5116 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696211 5116 kubelet.go:2451] "Starting kubelet main sync loop" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.697836 5116 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.698116 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705055 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705113 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705139 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705150 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" 
volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705161 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705199 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705211 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705223 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705239 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705282 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" 
volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705328 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705337 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705346 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705378 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705392 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705411 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" 
volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705420 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705430 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705453 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705483 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705494 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705506 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" 
seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705516 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705526 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705534 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705544 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705583 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705594 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" volumeName="kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705619 
5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705630 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b87002-b798-480a-8e17-83053d698239" volumeName="kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705639 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705649 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705661 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705668 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705686 5116 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705694 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705703 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705712 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705725 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705757 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705779 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" 
volumeName="kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705788 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706006 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706016 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e093be35-bb62-4843-b2e8-094545761610" volumeName="kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706027 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706036 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706044 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" 
volumeName="kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706088 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706096 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706126 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706147 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706158 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706185 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" 
volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706198 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706208 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706218 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706247 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706294 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706318 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" 
volumeName="kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706328 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706336 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706347 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706358 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706369 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706379 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" 
volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706388 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706410 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706420 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706428 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706436 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706446 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706455 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706477 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706486 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706510 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706519 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706528 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b638b8f4bb0070e40528db779baf6a2" volumeName="kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706535 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706543 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706567 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706575 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706611 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706659 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706682 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706693 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706718 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706727 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706738 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706752 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706760 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706772 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706780 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706789 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706798 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706807 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706818 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706827 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706836 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706847 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706856 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706866 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706876 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706884 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706892 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706905 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706912 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706922 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0effdbcf-dd7d-404d-9d48-77536d665a5d" volumeName="kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706931 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706940 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706949 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706958 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20c5c5b4bed930554494851fe3cb2b2a" volumeName="kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706975 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707002 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707012 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707020 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707029 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707045 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707054 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707063 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707071 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707083 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707092 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707102 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707111 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707120 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707133 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707142 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707152 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707161 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707213 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707226 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707236 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707257 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707268 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707276 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707287 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707296 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707307 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707316 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" volumeName="kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707326 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707335 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707345 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707355 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707366 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707375 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707385 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707401 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707411 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707420 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707430 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707439 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707448 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707461 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707471 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707480 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707490 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707500 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711315 5116 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711393 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711423 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711438 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711453 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711467 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711479 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711490 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711501 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711514 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711524 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711537 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711549 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711562 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711577 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711596 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711610 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711638 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711649 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711670 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711682 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711697 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711708 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711720 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711733 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711744 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711756 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711769 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711781 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711793 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711804 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711816 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711829 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711841 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711855 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711868 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711880 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711892 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711902 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" 
volumeName="kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711914 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af41de71-79cf-4590-bbe9-9e8b848862cb" volumeName="kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711926 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711943 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711955 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711969 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711983 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" 
volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711995 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712006 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712018 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712031 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712045 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712056 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" 
seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712070 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712098 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712116 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712126 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712136 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712148 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712182 
5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712206 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712225 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712235 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712245 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712256 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712267 5116 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712276 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712286 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712299 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712309 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712388 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712400 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" 
volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712410 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712419 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712430 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712441 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712452 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712463 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" seLinuxMountContext="" Mar 22 
00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712473 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712482 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712493 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712502 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712512 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712522 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 
00:08:49.712532 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712542 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712552 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f863fff9-286a-45fa-b8f0-8a86994b8440" volumeName="kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712563 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712574 5116 reconstruct.go:97] "Volume reconstruction finished" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712582 5116 reconciler.go:26] "Reconciler: start to sync state" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.719458 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.731570 5116 manager.go:341] "Starting Device Plugin manager" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.731828 5116 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 22 00:08:49 crc kubenswrapper[5116]: 
I0322 00:08:49.731850 5116 server.go:85] "Starting device plugin registration server" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732281 5116 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732301 5116 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732482 5116 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732587 5116 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732602 5116 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.736533 5116 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.736601 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.797907 5116 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798129 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798777 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798829 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798845 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.799608 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.799768 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.799813 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800194 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800221 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800233 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800265 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800284 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800294 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.801253 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.801325 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.801358 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802409 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802445 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802481 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802444 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802572 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802586 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.803469 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.803600 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.803651 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804196 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804236 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804251 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804208 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804281 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804295 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.805515 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.805547 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.805664    5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.806149    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.806190    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.806203    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.808291    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.808339    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.808352    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.809057    5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.809197    5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.810500    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.810523    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.810535    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.819612    5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.831290    5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.832606    5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833415    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833461    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833474    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833500    5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.834043    5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.837608    5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.859314    5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.891640    5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.897632    5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.914856    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.914919    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.914957    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915598    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915645    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915675    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915704    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915738    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915767    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915798    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915833    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915879    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915924    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915955    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915985    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916021    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916052    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916080    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916150    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916202    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916238    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916271    5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916305    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917185    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917216    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917297    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917763    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917936    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917948    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.918308    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017820    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017893    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017916    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017989    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018060    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018061    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018132    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018270    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018321    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018365    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018404    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018445    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018489    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018532    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018575    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018614    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018666    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018708    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018760    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018941    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018209    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019039    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019069    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019095    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019118    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019151    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019194    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019219    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019241    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019266    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019289    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019353    5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.034916    5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036143    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036266    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036296    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036352    5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.036976    5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.132426    5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.139760    5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.160613    5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.184262    5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a14caf222afb62aaabdc47808b6f944.slice/crio-53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955 WatchSource:0}: Error finding container 53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955: Status 404 returned error can't find the container with id 53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.188826    5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.190301    5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0bc7fcb0822a2c13eb2d22cd8c0641.slice/crio-377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888 WatchSource:0}: Error finding container 377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888: Status 404 returned error can't find the container with id 377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.191993    5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.198526    5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.198660    5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b638b8f4bb0070e40528db779baf6a2.slice/crio-b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c WatchSource:0}: Error finding container b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c: Status 404 returned error can't find the container with id b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.213099    5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e08c320b1e9e2405e6e0107bdf7eeb4.slice/crio-30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0 WatchSource:0}: Error finding container 30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0: Status 404 returned error can't find the container with id 30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.217056    5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c5c5b4bed930554494851fe3cb2b2a.slice/crio-1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee WatchSource:0}: Error finding container 1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee: Status 404 returned error can't find the container with id 1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.220736    5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.437816    5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438682    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438740    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438756    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438777    5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.439107    5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.595834    5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.701545    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.704111    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.705461    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.709070    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.710585    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955"}
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.748194    5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.021990    5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.047298    5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.112362    5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.144247    5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.240141    5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241789    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241842    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241860    5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241887    5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.242366    5116 kubelet_node_status.go:110]
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.595295 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.600768 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.601932 5116 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.714565 5116 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db" exitCode=0 Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.714624 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.714862 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.715567 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc 
kubenswrapper[5116]: I0322 00:08:51.715597 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.715607 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.715885 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.717134 5116 generic.go:358] "Generic (PLEG): container finished" podID="4e08c320b1e9e2405e6e0107bdf7eeb4" containerID="cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff" exitCode=0 Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.717178 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerDied","Data":"cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.717347 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.719040 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.719081 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.719104 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.719406 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 
22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.720455 5116 generic.go:358] "Generic (PLEG): container finished" podID="0b638b8f4bb0070e40528db779baf6a2" containerID="1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c" exitCode=0 Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.720527 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerDied","Data":"1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.720686 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.721578 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.721612 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.721621 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.721776 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.725207 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.725275 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.725292 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.726663 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" exitCode=0 Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.726694 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.726821 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.727339 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.727384 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.727396 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.727595 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730134 
5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730727 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730769 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730782 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.595864 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.624029 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.719513 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.734976 5116 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de" exitCode=0 Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.735047 5116 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.735359 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.736258 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.736288 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.736298 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.736500 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.739075 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.739184 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.740697 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.740721 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.740802 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.741098 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743287 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743317 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743328 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743373 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.744102 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.744241 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.744269 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc 
kubenswrapper[5116]: E0322 00:08:52.744674 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.746450 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.746544 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.747257 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.747288 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.747297 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.747460 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753549 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753582 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753593 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753605 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.842949 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843741 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843775 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843815 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843874 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.844360 5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.085524 5116 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.756708 5116 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195" exitCode=0 Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.756792 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195"} Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.756941 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.757477 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.757519 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.757532 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.757703 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.761589 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6"} Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.763056 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.763891 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.763918 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.764612 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.764656 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766517 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766557 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766572 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766730 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766816 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766924 5116 kubelet_node_status.go:736] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767012 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767037 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767047 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766517 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767077 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767087 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.766946 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.767572 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.767804 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.768288 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.540939 5116 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.766911 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767274 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767390 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767416 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767424 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767433 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767440 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767536 5116 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767949 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767985 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767998 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768153 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768212 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768224 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768223 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768257 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768271 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:54 crc kubenswrapper[5116]: E0322 00:08:54.768279 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:54 crc kubenswrapper[5116]: E0322 00:08:54.768490 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" 
err="node \"crc\" not found" node="crc" Mar 22 00:08:54 crc kubenswrapper[5116]: E0322 00:08:54.768821 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.112319 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.112540 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.113739 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.113790 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.113801 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:55 crc kubenswrapper[5116]: E0322 00:08:55.114117 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.705235 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.769910 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.769910 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.770939 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.770990 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771010 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771025 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771038 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771048 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:55 crc kubenswrapper[5116]: E0322 00:08:55.771539 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:55 crc kubenswrapper[5116]: E0322 00:08:55.772133 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.044832 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046122 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046227 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046244 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 
00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046281 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.064351 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.064696 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.066074 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.066144 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.066187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:56 crc kubenswrapper[5116]: E0322 00:08:56.066685 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.018413 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.018722 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.019674 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.019740 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.019760 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:57 crc kubenswrapper[5116]: E0322 00:08:57.020277 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.880981 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.881363 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.882546 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.882599 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.882612 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:57 crc kubenswrapper[5116]: E0322 00:08:57.883091 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.113574 5116 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded" start-of-body= Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.113674 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded" Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.874506 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-etcd/etcd-crc" Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.874744 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.875645 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.875753 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.875778 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:58 crc kubenswrapper[5116]: E0322 00:08:58.876703 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:59 crc kubenswrapper[5116]: E0322 00:08:59.737357 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.489593 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.489901 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.490815 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.490864 5116 kubelet_node_status.go:736] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.490880 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:00 crc kubenswrapper[5116]: E0322 00:09:00.491306 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.703755 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.704321 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.705233 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.705312 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.705333 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:01 crc kubenswrapper[5116]: E0322 00:09:01.705821 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.710959 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.788331 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.789220 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.789298 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.789324 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:01 crc kubenswrapper[5116]: E0322 00:09:01.789978 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.794075 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.853715 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.853992 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.854952 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.854986 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.854996 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:01 crc kubenswrapper[5116]: E0322 00:09:01.855348 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.790930 5116 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.792108 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.792273 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.792307 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:02 crc kubenswrapper[5116]: E0322 00:09:02.793121 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.872240 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.872318 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.547253 5116 trace.go:236] Trace[673956096]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:53.546) (total time: 10001ms): Mar 22 00:09:03 crc kubenswrapper[5116]: Trace[673956096]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS 
handshake timeout 10000ms (00:09:03.547) Mar 22 00:09:03 crc kubenswrapper[5116]: Trace[673956096]: [10.00103103s] [10.00103103s] END Mar 22 00:09:03 crc kubenswrapper[5116]: E0322 00:09:03.547319 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.596518 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.901251 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.901318 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.906492 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.906564 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 22 00:09:05 crc kubenswrapper[5116]: E0322 00:09:05.825548 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.028730 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.028920 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.029890 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.029917 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.029926 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:07 crc kubenswrapper[5116]: E0322 00:09:07.030204 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.035559 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.803764 5116 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.804440 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.804474 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.804493 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:07 crc kubenswrapper[5116]: E0322 00:09:07.805061 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.114019 5116 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.114139 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.755763 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.899544 5116 trace.go:236] Trace[2115267700]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:56.443) (total time: 12455ms): Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2115267700]: ---"Objects listed" error:csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope 12455ms (00:09:08.899) Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2115267700]: [12.455651425s] [12.455651425s] END Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.899578 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.900643 5116 trace.go:236] Trace[2079725794]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:54.293) (total time: 14606ms): Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2079725794]: ---"Objects listed" error:nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope 14606ms (00:09:08.900) Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2079725794]: [14.606786418s] [14.606786418s] END Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.900680 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 22 00:09:08 crc 
kubenswrapper[5116]: E0322 00:09:08.900583 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa607e028 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,LastTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.903276 5116 trace.go:236] Trace[885013279]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:56.783) (total time: 12119ms): Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[885013279]: ---"Objects listed" error:services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope 12119ms (00:09:08.903) Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[885013279]: [12.119439895s] [12.119439895s] END Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.903308 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.904189 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group 
\"\" at the cluster scope" node="crc" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.904250 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.906505 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.908518 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.909986 5116 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.911047 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aad4f781c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.734662172 +0000 UTC m=+0.756963545,LastTimestamp:2026-03-22 00:08:49.734662172 +0000 UTC m=+0.756963545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.913464 5116 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.798809755 +0000 UTC m=+0.821111128,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.915703 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.798837721 +0000 UTC m=+0.821139094,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.918181 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.798852044 +0000 UTC m=+0.821153417,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.920759 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.800211118 +0000 UTC m=+0.822512491,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.926614 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.800227142 +0000 UTC m=+0.822528515,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.928862 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.800239674 +0000 UTC m=+0.822541047,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.934305 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.800277022 +0000 UTC m=+0.822578395,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.941865 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.800289224 +0000 UTC m=+0.822590597,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.946093 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC 
m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.800299876 +0000 UTC m=+0.822601249,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.950892 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.802425547 +0000 UTC m=+0.824726920,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.955538 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.802476998 +0000 UTC m=+0.824778371,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.960437 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.80248712 +0000 UTC m=+0.824788493,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.965577 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.802544762 +0000 UTC m=+0.824846135,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.970936 5116 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.802580859 +0000 UTC m=+0.824882242,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.976214 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.802593171 +0000 UTC m=+0.824894544,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.981450 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.804228082 +0000 UTC m=+0.826529455,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.986083 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.804243305 +0000 UTC m=+0.826544688,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.991115 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.804257558 +0000 UTC m=+0.826558931,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.995342 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.804270421 +0000 UTC m=+0.826571794,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.998873 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.804290034 +0000 UTC m=+0.826591408,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.003557 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013ac8670d06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.189192454 +0000 UTC m=+1.211493837,LastTimestamp:2026-03-22 00:08:50.189192454 +0000 UTC m=+1.211493837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.007001 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013ac8e58c67 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.197482599 +0000 UTC m=+1.219783972,LastTimestamp:2026-03-22 00:08:50.197482599 +0000 UTC m=+1.219783972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.012335 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013ac9629a14 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.2056781 +0000 UTC m=+1.227979473,LastTimestamp:2026-03-22 00:08:50.2056781 +0000 UTC m=+1.227979473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.020635 5116 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013aca1701c7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.217501127 +0000 UTC m=+1.239802520,LastTimestamp:2026-03-22 00:08:50.217501127 +0000 UTC m=+1.239802520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.025596 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013aca4e627c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.221130364 +0000 UTC 
m=+1.243431737,LastTimestamp:2026-03-22 00:08:50.221130364 +0000 UTC m=+1.243431737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.030012 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013aefe1f70a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851559178 +0000 UTC m=+1.873860571,LastTimestamp:2026-03-22 00:08:50.851559178 +0000 UTC m=+1.873860571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.036300 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013aefe36e0d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created 
container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851655181 +0000 UTC m=+1.873956564,LastTimestamp:2026-03-22 00:08:50.851655181 +0000 UTC m=+1.873956564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.040920 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013aefe5f8d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851821777 +0000 UTC m=+1.874123150,LastTimestamp:2026-03-22 00:08:50.851821777 +0000 UTC m=+1.874123150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.045464 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013aefe75d14 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container: wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.85191298 +0000 UTC m=+1.874214353,LastTimestamp:2026-03-22 00:08:50.85191298 +0000 UTC m=+1.874214353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.050427 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013aefe87d36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851986742 +0000 UTC m=+1.874288135,LastTimestamp:2026-03-22 00:08:50.851986742 +0000 UTC m=+1.874288135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.054688 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013af0b5c250 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.865439312 +0000 UTC m=+1.887740695,LastTimestamp:2026-03-22 00:08:50.865439312 +0000 UTC m=+1.887740695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.058876 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013af0bf7732 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866075442 +0000 UTC m=+1.888376835,LastTimestamp:2026-03-22 00:08:50.866075442 +0000 UTC m=+1.888376835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.066521 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013af0c08bcf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866146255 +0000 UTC m=+1.888447628,LastTimestamp:2026-03-22 00:08:50.866146255 +0000 UTC m=+1.888447628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.071021 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013af0c18fdb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866212827 +0000 UTC m=+1.888514200,LastTimestamp:2026-03-22 00:08:50.866212827 +0000 UTC m=+1.888514200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.075766 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013af0c665d3 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866529747 +0000 UTC m=+1.888831130,LastTimestamp:2026-03-22 00:08:50.866529747 +0000 UTC m=+1.888831130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.079974 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013af0dc9a75 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.867985013 +0000 UTC m=+1.890286386,LastTimestamp:2026-03-22 00:08:50.867985013 +0000 UTC m=+1.890286386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.085311 5116 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b00b301b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.133694385 +0000 UTC m=+2.155995758,LastTimestamp:2026-03-22 00:08:51.133694385 +0000 UTC m=+2.155995758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.090571 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b014f9a50 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.143957072 +0000 UTC m=+2.166258445,LastTimestamp:2026-03-22 00:08:51.143957072 +0000 UTC m=+2.166258445,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.095611 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b015ea35f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.144942431 +0000 UTC m=+2.167243814,LastTimestamp:2026-03-22 00:08:51.144942431 +0000 UTC m=+2.167243814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.102026 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b1b045f6a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container: kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.57523441 +0000 UTC m=+2.597535783,LastTimestamp:2026-03-22 00:08:51.57523441 +0000 UTC m=+2.597535783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.109301 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b1c14273e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.593045822 +0000 UTC m=+2.615347195,LastTimestamp:2026-03-22 00:08:51.593045822 +0000 UTC m=+2.615347195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.110693 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b1c24a778 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.594127224 +0000 UTC m=+2.616428597,LastTimestamp:2026-03-22 00:08:51.594127224 +0000 UTC m=+2.616428597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.116067 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b237610ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.716903118 +0000 UTC m=+2.739204491,LastTimestamp:2026-03-22 00:08:51.716903118 +0000 UTC 
m=+2.739204491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.123715 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013b23afd054 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.7206877 +0000 UTC m=+2.742989073,LastTimestamp:2026-03-22 00:08:51.7206877 +0000 UTC m=+2.742989073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.128474 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b240627ff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.726346239 +0000 UTC m=+2.748647612,LastTimestamp:2026-03-22 00:08:51.726346239 +0000 UTC m=+2.748647612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.133013 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b243e3ccb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.730021579 +0000 UTC m=+2.752322952,LastTimestamp:2026-03-22 00:08:51.730021579 +0000 UTC m=+2.752322952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.137679 5116 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b29643338 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container: kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.816395576 +0000 UTC m=+2.838696949,LastTimestamp:2026-03-22 00:08:51.816395576 +0000 UTC m=+2.838696949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.142824 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b2b476fad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.848064941 +0000 UTC m=+2.870366314,LastTimestamp:2026-03-22 00:08:51.848064941 +0000 UTC 
m=+2.870366314,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.147349 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b2df8d839 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container: etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.893246009 +0000 UTC m=+2.915547382,LastTimestamp:2026-03-22 00:08:51.893246009 +0000 UTC m=+2.915547382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.154415 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b2f7a7013 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.918516243 +0000 UTC m=+2.940817616,LastTimestamp:2026-03-22 
00:08:51.918516243 +0000 UTC m=+2.940817616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.160222 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b2fbd18e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.922884833 +0000 UTC m=+2.945186206,LastTimestamp:2026-03-22 00:08:51.922884833 +0000 UTC m=+2.945186206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.165465 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b30a93fe4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.938361316 +0000 UTC 
m=+2.960662679,LastTimestamp:2026-03-22 00:08:51.938361316 +0000 UTC m=+2.960662679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.172617 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b30b92c1a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.939404826 +0000 UTC m=+2.961706199,LastTimestamp:2026-03-22 00:08:51.939404826 +0000 UTC m=+2.961706199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.177598 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013b34285a10 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.997022736 +0000 UTC m=+3.019324109,LastTimestamp:2026-03-22 00:08:51.997022736 +0000 UTC m=+3.019324109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.186231 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b35479ca9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.015848617 +0000 UTC m=+3.038149990,LastTimestamp:2026-03-22 00:08:52.015848617 +0000 UTC m=+3.038149990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.192237 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013b355d600a 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.01727489 +0000 UTC m=+3.039576263,LastTimestamp:2026-03-22 00:08:52.01727489 +0000 UTC m=+3.039576263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.198322 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b3613fab5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.029242037 +0000 UTC m=+3.051543410,LastTimestamp:2026-03-22 00:08:52.029242037 +0000 UTC m=+3.051543410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.206970 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b36977ef0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.037861104 +0000 UTC m=+3.060162477,LastTimestamp:2026-03-22 00:08:52.037861104 +0000 UTC m=+3.060162477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.211643 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b3dd7f4cf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container: kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.159526095 +0000 UTC m=+3.181827468,LastTimestamp:2026-03-22 00:08:52.159526095 +0000 UTC m=+3.181827468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: 
E0322 00:09:09.213067 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b3ecceded openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.175580653 +0000 UTC m=+3.197882026,LastTimestamp:2026-03-22 00:08:52.175580653 +0000 UTC m=+3.197882026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.216548 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b3edd5e2e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.176657966 +0000 UTC 
m=+3.198959339,LastTimestamp:2026-03-22 00:08:52.176657966 +0000 UTC m=+3.198959339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.217732 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b42e74eef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container: kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.244418287 +0000 UTC m=+3.266719660,LastTimestamp:2026-03-22 00:08:52.244418287 +0000 UTC m=+3.266719660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.224635 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b43e232a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.260860579 +0000 UTC m=+3.283161952,LastTimestamp:2026-03-22 00:08:52.260860579 +0000 UTC m=+3.283161952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.230301 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b43ef7c10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.261731344 +0000 UTC m=+3.284032717,LastTimestamp:2026-03-22 00:08:52.261731344 +0000 UTC m=+3.284032717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.239740 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b4d67adc6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container: kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.420603334 +0000 UTC m=+3.442904707,LastTimestamp:2026-03-22 00:08:52.420603334 +0000 UTC m=+3.442904707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.245011 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b4e0091a7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.430623143 +0000 UTC m=+3.452924516,LastTimestamp:2026-03-22 00:08:52.430623143 +0000 UTC m=+3.452924516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.249741 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b50cce364 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container: kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.477567844 +0000 UTC m=+3.499869217,LastTimestamp:2026-03-22 00:08:52.477567844 +0000 UTC m=+3.499869217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.254813 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b51de25bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.495476159 +0000 UTC m=+3.517777522,LastTimestamp:2026-03-22 00:08:52.495476159 +0000 UTC m=+3.517777522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.259642 5116 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b51ee6e6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.49654334 +0000 UTC m=+3.518844713,LastTimestamp:2026-03-22 00:08:52.49654334 +0000 UTC m=+3.518844713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.264351 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5d7976e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.690204389 +0000 UTC m=+3.712505762,LastTimestamp:2026-03-22 00:08:52.690204389 +0000 UTC 
m=+3.712505762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.269254 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e7e9435 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.707316789 +0000 UTC m=+3.729618162,LastTimestamp:2026-03-22 00:08:52.707316789 +0000 UTC m=+3.729618162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.274203 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e8dae9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,LastTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.279054 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b605f8bc0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.73883744 +0000 UTC m=+3.761138813,LastTimestamp:2026-03-22 00:08:52.73883744 +0000 UTC m=+3.761138813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.285083 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6e019740 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,LastTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.289515 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b6e07b227 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container: etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967961127 +0000 UTC m=+3.990262500,LastTimestamp:2026-03-22 00:08:52.967961127 +0000 UTC m=+3.990262500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.293356 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6f3968d6 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,LastTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.297276 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b6f5d5c65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.990352485 +0000 UTC m=+4.012653858,LastTimestamp:2026-03-22 00:08:52.990352485 +0000 UTC m=+4.012653858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.300918 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b9d2aa445 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.758780485 +0000 UTC m=+4.781081858,LastTimestamp:2026-03-22 00:08:53.758780485 +0000 UTC m=+4.781081858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.305310 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013baa563f20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.979741984 +0000 UTC m=+5.002043357,LastTimestamp:2026-03-22 00:08:53.979741984 +0000 UTC m=+5.002043357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.309338 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189f013baae032f2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.988782834 +0000 UTC m=+5.011084207,LastTimestamp:2026-03-22 00:08:53.988782834 +0000 UTC m=+5.011084207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.314045 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013baaecf1c7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.989618119 +0000 UTC m=+5.011919492,LastTimestamp:2026-03-22 00:08:53.989618119 +0000 UTC m=+5.011919492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.317620 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bb5a1dc16 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.169246742 +0000 UTC m=+5.191548115,LastTimestamp:2026-03-22 00:08:54.169246742 +0000 UTC m=+5.191548115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.321591 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bb6a04e4c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.185922124 +0000 UTC m=+5.208223507,LastTimestamp:2026-03-22 00:08:54.185922124 +0000 UTC m=+5.208223507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.325678 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bb6b5308b openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.187290763 +0000 UTC m=+5.209592126,LastTimestamp:2026-03-22 00:08:54.187290763 +0000 UTC m=+5.209592126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.329659 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bc2888400 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container: etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.3856896 +0000 UTC m=+5.407990973,LastTimestamp:2026-03-22 00:08:54.3856896 +0000 UTC m=+5.407990973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.333595 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189f013bc344cf4e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.398029646 +0000 UTC m=+5.420331019,LastTimestamp:2026-03-22 00:08:54.398029646 +0000 UTC m=+5.420331019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.337597 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bc3549ab4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.399064756 +0000 UTC m=+5.421366149,LastTimestamp:2026-03-22 00:08:54.399064756 +0000 UTC m=+5.421366149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.340985 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bce189c5d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container: etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.579682397 +0000 UTC m=+5.601983770,LastTimestamp:2026-03-22 00:08:54.579682397 +0000 UTC m=+5.601983770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.344588 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bcecd9341 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.591542081 +0000 UTC m=+5.613843444,LastTimestamp:2026-03-22 00:08:54.591542081 +0000 UTC m=+5.613843444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.348696 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189f013bcedf89d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.592719315 +0000 UTC m=+5.615020688,LastTimestamp:2026-03-22 00:08:54.592719315 +0000 UTC m=+5.615020688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.353856 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bd84b3192 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container: etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.750769554 +0000 UTC m=+5.773070927,LastTimestamp:2026-03-22 00:08:54.750769554 +0000 UTC m=+5.773070927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.357983 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bd9080982 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.763145602 +0000 UTC m=+5.785446975,LastTimestamp:2026-03-22 00:08:54.763145602 +0000 UTC m=+5.785446975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.365793 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.365849 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.366129 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.366153 5116 prober.go:120] "Probe failed" 
probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.366272 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-controller-manager-crc.189f013ca0bc9e2c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://localhost:10357/healthz": context deadline exceeded Mar 22 00:09:09 crc kubenswrapper[5116]: body: Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:58.113646124 +0000 UTC m=+9.135947487,LastTimestamp:2026-03-22 00:08:58.113646124 +0000 UTC m=+9.135947487,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 22 00:09:09 crc kubenswrapper[5116]: > Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.370737 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013ca0bdad78 openshift-kube-controller-manager 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://localhost:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:58.113715576 +0000 UTC m=+9.136016949,LastTimestamp:2026-03-22 00:08:58.113715576 +0000 UTC m=+9.136016949,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.377248 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013dbc5fc9e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 22 00:09:09 crc kubenswrapper[5116]: body: Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:02.872291816 +0000 UTC m=+13.894593199,LastTimestamp:2026-03-22 00:09:02.872291816 +0000 UTC m=+13.894593199,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 22 
00:09:09 crc kubenswrapper[5116]: > Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.382324 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013dbc609578 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:02.872343928 +0000 UTC m=+13.894645311,LastTimestamp:2026-03-22 00:09:02.872343928 +0000 UTC m=+13.894645311,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.387306 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5205d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 22 00:09:09 crc kubenswrapper[5116]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 22 00:09:09 crc kubenswrapper[5116]: Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901294685 +0000 UTC m=+14.923596058,LastTimestamp:2026-03-22 00:09:03.901294685 +0000 UTC m=+14.923596058,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 22 00:09:09 crc kubenswrapper[5116]: > Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.391665 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5dd32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901343026 +0000 UTC m=+14.923644399,LastTimestamp:2026-03-22 00:09:03.901343026 +0000 UTC m=+14.923644399,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.395720 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013df9b5205d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5205d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 22 00:09:09 crc kubenswrapper[5116]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 22 00:09:09 crc kubenswrapper[5116]: Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901294685 +0000 UTC m=+14.923596058,LastTimestamp:2026-03-22 00:09:03.90654013 +0000 UTC m=+14.928841503,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 22 00:09:09 crc kubenswrapper[5116]: > Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.400013 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013df9b5dd32\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5dd32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901343026 +0000 UTC m=+14.923644399,LastTimestamp:2026-03-22 00:09:03.906588041 +0000 UTC m=+14.928889414,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.405046 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-controller-manager-crc.189f013ef4cf42a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://localhost:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 22 00:09:09 crc kubenswrapper[5116]: body: Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:08.114088617 +0000 UTC m=+19.136389990,LastTimestamp:2026-03-22 00:09:08.114088617 +0000 UTC m=+19.136389990,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 22 00:09:09 crc kubenswrapper[5116]: > Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.409551 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013ef4d0c60f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:08.114187791 +0000 UTC m=+19.136489154,LastTimestamp:2026-03-22 00:09:08.114187791 +0000 UTC m=+19.136489154,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.415340 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013f3f6b3d2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer Mar 22 00:09:09 crc kubenswrapper[5116]: body: Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 
00:09:09.365824814 +0000 UTC m=+20.388126187,LastTimestamp:2026-03-22 00:09:09.365824814 +0000 UTC m=+20.388126187,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 22 00:09:09 crc kubenswrapper[5116]: > Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.420518 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f3f6c9857 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:09.365913687 +0000 UTC m=+20.388215060,LastTimestamp:2026-03-22 00:09:09.365913687 +0000 UTC m=+20.388215060,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.425393 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013f3f7019f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 22 00:09:09 crc kubenswrapper[5116]: body: Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:09.366143474 +0000 UTC m=+20.388444857,LastTimestamp:2026-03-22 00:09:09.366143474 +0000 UTC m=+20.388444857,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 22 00:09:09 crc kubenswrapper[5116]: > Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.430114 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f3f70ecda openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:09.366197466 +0000 UTC m=+20.388498849,LastTimestamp:2026-03-22 00:09:09.366197466 +0000 UTC m=+20.388498849,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc 
kubenswrapper[5116]: I0322 00:09:09.600919 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.737630 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.809192 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.811365 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6" exitCode=255 Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.811471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6"} Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.811825 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.812638 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.812688 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.812705 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:09 crc 
kubenswrapper[5116]: E0322 00:09:09.813109 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.813404 5116 scope.go:117] "RemoveContainer" containerID="e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.823987 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b5e8dae9d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e8dae9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,LastTimestamp:2026-03-22 00:09:09.814470695 +0000 UTC m=+20.836772068,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.062911 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b6e019740\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6e019740 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,LastTimestamp:2026-03-22 00:09:10.056907501 +0000 UTC m=+21.079208874,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.072068 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b6f3968d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6f3968d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,LastTimestamp:2026-03-22 00:09:10.066933437 +0000 UTC m=+21.089234810,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.600076 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.815349 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.815907 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817207 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" exitCode=255 Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817252 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba"} Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817287 5116 scope.go:117] "RemoveContainer" containerID="e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817474 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818008 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818039 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818052 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:10 crc 
kubenswrapper[5116]: E0322 00:09:10.818432 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818682 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.818932 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.824653 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:11 
crc kubenswrapper[5116]: I0322 00:09:11.600147 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.821724 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.878648 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.879226 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.880069 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.880122 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.880136 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:11 crc kubenswrapper[5116]: E0322 00:09:11.880675 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.894518 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.230836 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API 
group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.599792 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826263 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826865 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826905 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826917 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.827293 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.871040 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.871605 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.872375 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.872403 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:12 crc 
kubenswrapper[5116]: I0322 00:09:12.872414 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.872771 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.873025 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.873246 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.877397 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC 
m=+21.841180635,LastTimestamp:2026-03-22 00:09:12.873214748 +0000 UTC m=+23.895516121,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:13 crc kubenswrapper[5116]: I0322 00:09:13.599652 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:14 crc kubenswrapper[5116]: E0322 00:09:14.570912 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 22 00:09:14 crc kubenswrapper[5116]: I0322 00:09:14.602691 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.121112 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.121424 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.122396 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.122437 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.122450 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:15 crc kubenswrapper[5116]: E0322 00:09:15.122870 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.128566 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.304297 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.305831 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.305955 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.305979 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.306020 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:09:15 crc kubenswrapper[5116]: E0322 00:09:15.321944 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.599941 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 
00:09:15.832370 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.833022 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.833083 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.833095 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:15 crc kubenswrapper[5116]: E0322 00:09:15.833424 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:16 crc kubenswrapper[5116]: E0322 00:09:16.073778 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 22 00:09:16 crc kubenswrapper[5116]: E0322 00:09:16.553142 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 22 00:09:16 crc kubenswrapper[5116]: I0322 00:09:16.602564 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:16 crc kubenswrapper[5116]: E0322 00:09:16.740598 5116 reflector.go:200] "Failed to 
watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 22 00:09:17 crc kubenswrapper[5116]: I0322 00:09:17.602374 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:18 crc kubenswrapper[5116]: I0322 00:09:18.600572 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.235625 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.366679 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.366922 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.367725 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.367837 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.367924 
5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.368440 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.368781 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.369033 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.373452 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:19.369003969 +0000 UTC 
m=+30.391305342,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.604891 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.737981 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:20 crc kubenswrapper[5116]: I0322 00:09:20.598232 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:21 crc kubenswrapper[5116]: I0322 00:09:21.598722 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.322103 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.323149 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.323271 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.323299 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 
00:09:22.323335 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:09:22 crc kubenswrapper[5116]: E0322 00:09:22.336574 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.598736 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:23 crc kubenswrapper[5116]: I0322 00:09:23.596261 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:24 crc kubenswrapper[5116]: I0322 00:09:24.603737 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:25 crc kubenswrapper[5116]: E0322 00:09:25.376668 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 22 00:09:25 crc kubenswrapper[5116]: I0322 00:09:25.599537 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:26 crc 
kubenswrapper[5116]: E0322 00:09:26.241828 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:26 crc kubenswrapper[5116]: I0322 00:09:26.602971 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:27 crc kubenswrapper[5116]: I0322 00:09:27.600813 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:28 crc kubenswrapper[5116]: I0322 00:09:28.600364 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.337483 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339710 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339782 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339804 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339843 5116 kubelet_node_status.go:78] 
"Attempting to register node" node="crc" Mar 22 00:09:29 crc kubenswrapper[5116]: E0322 00:09:29.352881 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.600711 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:29 crc kubenswrapper[5116]: E0322 00:09:29.738831 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:30 crc kubenswrapper[5116]: E0322 00:09:30.208105 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 22 00:09:30 crc kubenswrapper[5116]: I0322 00:09:30.600371 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:31 crc kubenswrapper[5116]: I0322 00:09:31.601277 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:32 crc kubenswrapper[5116]: E0322 00:09:32.365276 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User 
\"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 22 00:09:32 crc kubenswrapper[5116]: I0322 00:09:32.603385 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:32 crc kubenswrapper[5116]: E0322 00:09:32.993563 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.248415 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.600059 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.697692 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.699438 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.699503 5116 kubelet_node_status.go:736] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.699519 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.700065 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.700448 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.708653 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b5e8dae9d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e8dae9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,LastTimestamp:2026-03-22 00:09:33.702387974 +0000 UTC m=+44.724689377,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.921919 5116 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189f013b6e019740\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6e019740 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,LastTimestamp:2026-03-22 00:09:33.917243181 +0000 UTC m=+44.939544554,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.937638 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b6f3968d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6f3968d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,LastTimestamp:2026-03-22 00:09:33.931114168 +0000 UTC m=+44.953415561,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.604253 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.880966 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.883539 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"} Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.883830 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.895430 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.895501 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.895527 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:34 crc kubenswrapper[5116]: E0322 00:09:34.900736 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.602710 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.888267 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.888782 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890509 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" exitCode=255 Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890578 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"} Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890636 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890841 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.891405 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.891440 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.891453 5116 kubelet_node_status.go:736] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:35 crc kubenswrapper[5116]: E0322 00:09:35.891783 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.892051 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" Mar 22 00:09:35 crc kubenswrapper[5116]: E0322 00:09:35.896196 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:35 crc kubenswrapper[5116]: E0322 00:09:35.901571 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:35.896138121 +0000 UTC 
m=+46.918439494,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.353234 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354255 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354271 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354296 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:09:36 crc kubenswrapper[5116]: E0322 00:09:36.367962 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.599496 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.895129 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log" Mar 22 00:09:37 crc kubenswrapper[5116]: I0322 00:09:37.599536 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:38 crc kubenswrapper[5116]: I0322 00:09:38.603417 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:39 crc kubenswrapper[5116]: I0322 00:09:39.600472 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:39 crc kubenswrapper[5116]: E0322 00:09:39.739316 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:40 crc kubenswrapper[5116]: E0322 00:09:40.254713 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:40 crc kubenswrapper[5116]: I0322 00:09:40.600325 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:41 crc kubenswrapper[5116]: I0322 00:09:41.603082 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.600759 5116 csi_plugin.go:988] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.871981 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.872249 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.873080 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.873234 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.873264 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:42 crc kubenswrapper[5116]: E0322 00:09:42.873848 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.874280 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" Mar 22 00:09:42 crc kubenswrapper[5116]: E0322 00:09:42.874670 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:42 crc kubenswrapper[5116]: E0322 00:09:42.880089 5116 event.go:359] 
"Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:42.874607975 +0000 UTC m=+53.896909378,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:43 crc kubenswrapper[5116]: E0322 00:09:43.065508 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.368989 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.370030 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.370089 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:43 crc 
kubenswrapper[5116]: I0322 00:09:43.370113 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.370154 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:09:43 crc kubenswrapper[5116]: E0322 00:09:43.377487 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.602859 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.601081 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.773759 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.774081 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.775161 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.775270 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.775290 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.775814 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.884223 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.884545 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.885670 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.885733 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.885753 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.886592 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.887104 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.887549 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.893285 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:44.887481395 +0000 UTC m=+55.909782808,Count:6,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:45 crc kubenswrapper[5116]: I0322 00:09:45.602757 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:46 crc kubenswrapper[5116]: I0322 00:09:46.603160 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:47 crc kubenswrapper[5116]: E0322 00:09:47.261531 5116 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:47 crc kubenswrapper[5116]: I0322 00:09:47.599569 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:48 crc kubenswrapper[5116]: I0322 00:09:48.602938 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:49 crc kubenswrapper[5116]: I0322 00:09:49.601294 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:49 crc kubenswrapper[5116]: E0322 00:09:49.740478 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.377860 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379141 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379253 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379280 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 
00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379317 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:09:50 crc kubenswrapper[5116]: E0322 00:09:50.396946 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.599929 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:51 crc kubenswrapper[5116]: I0322 00:09:51.600666 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:52 crc kubenswrapper[5116]: I0322 00:09:52.601383 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:53 crc kubenswrapper[5116]: I0322 00:09:53.604269 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:54 crc kubenswrapper[5116]: E0322 00:09:54.269633 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:54 crc 
kubenswrapper[5116]: I0322 00:09:54.598692 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.007098 5116 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jq45n" Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.019596 5116 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jq45n" Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.103729 5116 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.447339 5116 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 22 00:09:56 crc kubenswrapper[5116]: I0322 00:09:56.021360 5116 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2026-04-21 00:04:55 +0000 UTC" deadline="2026-04-17 17:49:21.420122729 +0000 UTC" Mar 22 00:09:56 crc kubenswrapper[5116]: I0322 00:09:56.021423 5116 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="641h39m25.398705341s" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.398152 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399336 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399470 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399556 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399757 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.407112 5116 kubelet_node_status.go:127] "Node was previously registered" node="crc" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.407523 5116 kubelet_node_status.go:81] "Successfully registered node" node="crc" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.407617 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410517 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410558 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410571 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410589 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410604 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.430804 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438489 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438553 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438573 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438594 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438609 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.448957 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456560 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456608 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456626 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456641 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456652 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.465834 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472125 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472151 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472160 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472191 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472204 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.480488 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.480802 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.480885 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.581392 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.681606 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.782204 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.883376 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.984704 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.086424 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.186657 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.287098 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.387843 5116 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.488702 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.589771 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.690019 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.697404 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698095 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698132 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698145 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.698517 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698726 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.790080 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.891033 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.955836 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.957554 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"} Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.957764 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.958415 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.958449 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.958461 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.958835 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.992045 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.092135 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.193116 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc 
kubenswrapper[5116]: E0322 00:09:59.294320 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.395270 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.495494 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.595676 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.696249 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.741411 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.797242 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.897696 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.998785 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.099856 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.200503 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.301158 5116 kubelet_node_status.go:515] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.401678 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.502435 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.602849 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.703959 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.805003 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.905963 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.964023 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.964674 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.966760 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" exitCode=255 Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.966801 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"} Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.966836 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.967142 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.968037 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.968096 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.968119 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.969106 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.969706 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.978317 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.006492 5116 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.106846 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.207988 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.308373 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.409580 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.510380 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.611239 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.711815 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.812627 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.913784 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:01 crc kubenswrapper[5116]: I0322 00:10:01.969711 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.014880 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc 
kubenswrapper[5116]: E0322 00:10:02.115929 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.216951 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.317237 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.338147 5116 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.417726 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.518817 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.619247 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.719677 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.820161 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.871156 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.871506 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.872558 5116 kubelet_node_status.go:736] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.872645 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.872657 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.873306 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.873676 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.873960 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.920669 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.020788 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.121116 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.221540 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.322318 5116 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.422422 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.523580 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.624398 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.725327 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.826253 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.927200 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.027331 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.127723 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.228403 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.328922 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.429540 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc 
kubenswrapper[5116]: E0322 00:10:04.529828 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.630664 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.731637 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.832576 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.933713 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.034076 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.134472 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.235638 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.336444 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.437359 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.537515 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.638024 5116 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.739269 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.839913 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.940853 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.041942 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.142297 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.242474 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.342941 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.443960 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.544599 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.645249 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.745846 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.846723 5116 kubelet_node_status.go:515] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.947901 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.048778 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.148933 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.249818 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.350560 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.450688 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.551214 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.622984 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627613 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627648 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627660 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:07 crc 
kubenswrapper[5116]: I0322 00:10:07.627675 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627684 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.640542 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a391
50f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\
\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\
":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645399 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645435 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645444 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645458 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645467 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.658189 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8
108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\
\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661496 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661563 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661575 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661591 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661601 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.671063 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8
108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\
\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677152 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677222 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677234 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677250 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677263 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.687792 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8
108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\
\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.687906 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.687933 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.788902 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.889537 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.989640 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.089727 5116 kubelet_node_status.go:515] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.190220 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.291303 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.392528 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.493591 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.594507 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.694921 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.795307 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.895707 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.958717 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.959033 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.960127 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.960217 
5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.960236 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.960796 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.961105 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.961407 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.996816 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.097387 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.197974 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.298376 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.398755 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 
crc kubenswrapper[5116]: E0322 00:10:09.499865 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.600776 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.701911 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.742322 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.802663 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.902878 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.003558 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.104053 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.205231 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.306343 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.406743 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.507052 5116 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.607590 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.708534 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.809183 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.909372 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.009892 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.110345 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.210599 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.311249 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.411943 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.512837 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.613553 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.714730 5116 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.815949 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.916828 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.017262 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.118253 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.218971 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.319216 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.420448 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.520598 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.621259 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.721779 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.822528 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:12 crc 
kubenswrapper[5116]: E0322 00:10:12.923368 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.024398 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.125315 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.225424 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.326512 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.427343 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.528518 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.629695 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.730573 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.831727 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.932769 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.033570 5116 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.134644 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.235010 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.335529 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.436542 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.537022 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.638075 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.739156 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.840271 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:14 crc kubenswrapper[5116]: I0322 00:10:14.886336 5116 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.940946 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.041988 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.142447 5116 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.243209 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.343850 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.443969 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.544544 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.645642 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.696772 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.697981 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.698043 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.698057 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.698736 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.746024 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.846383 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.947423 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.047587 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.148688 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.249647 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.350652 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.451211 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.552299 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.653404 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.754543 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.855773 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.956120 5116 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.056338 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.157003 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.258047 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.358236 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.459007 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.560084 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.660696 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.761149 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.862208 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.962578 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.009259 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 
00:10:18.014794 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015149 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015319 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015485 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015648 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:18Z","lastTransitionTime":"2026-03-22T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.024611 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032405 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032507 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032527 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032583 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032602 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:18Z","lastTransitionTime":"2026-03-22T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.047906 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.051286 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.051349 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.051369 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.051394 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.051411 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:18Z","lastTransitionTime":"2026-03-22T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.061070 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063908 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063952 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063966 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063983 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063994 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:18Z","lastTransitionTime":"2026-03-22T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.072994 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.073246 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.073299 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.174215 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.275160 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.375975 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.476555 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.576691 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.676994 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.778281 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.878697 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.979490 5116 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.079647 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.180115 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.281065 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.382197 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.483113 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.583923 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.684244 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.743358 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.784620 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.885244 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.986341 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.087381 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.187743 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.287917 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.388346 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.489256 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.589904 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.690549 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.791076 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.891889 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.992985 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.094120 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.194959 5116 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.295942 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.396838 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.497661 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.597829 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.698183 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.798673 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.831045 5116 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901820 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901897 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901911 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901936 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901953 5116 
setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:21Z","lastTransitionTime":"2026-03-22T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.918965 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.932900 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.945379 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-etcd/etcd-crc" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004399 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004470 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004485 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004511 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004531 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.047853 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.106992 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107036 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107045 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107061 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107072 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.146071 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209588 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209638 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209651 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209668 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209680 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311914 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311963 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311976 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311994 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.312009 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413743 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413817 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413842 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413873 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413896 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516449 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516499 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516515 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516531 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516543 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620412 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620469 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620484 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620505 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620521 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.628629 5116 apiserver.go:52] "Watching apiserver" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.635349 5116 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.635917 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-multus/multus-additional-cni-plugins-bk75f","openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5","openshift-kube-apiserver/kube-apiserver-crc","openshift-multus/multus-9sq6c","openshift-multus/network-metrics-daemon-wlq8c","openshift-network-operator/iptables-alerter-5jnd7","openshift-etcd/etcd-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-machine-config-operator/machine-config-daemon-66g6d","openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6","openshift-network-diagnostics/network-check-target-fhkjl","openshift-network-node-identity/network-node-identity-dgvkt","openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv","openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4","openshift-image-registry/node-ca-2rwjp","openshift-ovn-kubernetes/ovnkube-node-n9zvq","openshift-dns/node-resolver-nwnjb","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.637154 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.641859 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.641980 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.644529 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.644762 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.644546 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.645595 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.645827 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.646457 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.646699 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.646828 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.647004 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.647136 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.648843 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.648865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.649227 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.663092 5116 status_manager.go:919] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.676292 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.680579 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.680667 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.680709 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.683011 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.683628 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.683854 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.688555 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.699093 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701641 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701642 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701840 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701719 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.702218 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.704411 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.716510 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723408 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723466 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723479 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723499 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723512 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.728415 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.728677 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731538 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731628 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731735 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731759 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731787 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.737294 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.737360 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.737679 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.738692 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.738994 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.739216 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.739347 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.739620 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.741852 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.741992 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.744671 5116 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748010 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748073 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748089 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748735 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.749069 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.751470 5116 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.752068 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.752312 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.752350 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.755541 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.756723 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.756923 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.757054 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.759921 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.759966 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.760783 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.768744 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780456 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d36c245b-3d7f-48eb-848e-c54198ae38a4-hosts-file\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780516 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-os-release\") pod 
\"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780540 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-etc-kubernetes\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780570 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-kubelet\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-rootfs\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780626 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5zcr\" (UniqueName: \"kubernetes.io/projected/d36c245b-3d7f-48eb-848e-c54198ae38a4-kube-api-access-x5zcr\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780663 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780740 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d36c245b-3d7f-48eb-848e-c54198ae38a4-tmp-dir\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780770 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-cnibin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780798 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-socket-dir-parent\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780847 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780874 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod 
\"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781058 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781399 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781616 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-k8s-cni-cncf-io\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781651 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-netns\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781675 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-proxy-tls\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " 
pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781717 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782022 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782053 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-system-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782073 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782093 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") 
pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782112 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-multus\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782128 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-multus-daemon-config\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782145 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-conf-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782180 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782211 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-bin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782230 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-hostroot\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782248 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-multus-certs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782263 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrqs\" (UniqueName: \"kubernetes.io/projected/5188f25b-37c3-46f1-b939-199c6e082848-kube-api-access-sqrqs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782285 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782305 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkhq\" (UniqueName: 
\"kubernetes.io/projected/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-kube-api-access-5pkhq\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782329 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782348 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-cni-binary-copy\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782368 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.782378 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.782543 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:23.282504674 +0000 UTC m=+94.304806047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.783469 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782392 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.783640 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.783673 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.783785 5116 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.283755015 +0000 UTC m=+94.306056388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.784198 5116 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.784946 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.785875 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.785926 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.795223 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.795600 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.797706 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.801639 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.802216 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803286 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803312 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803324 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803340 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803364 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803380 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803409 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.303389678 +0000 UTC m=+94.325691051 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803447 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.303424359 +0000 UTC m=+94.325725732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.804501 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.807430 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.815914 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.820680 5116 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.824531 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826639 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826684 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826697 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826715 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826728 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.838574 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4ec1f0e405
3fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-22T00:09:59Z\\\",\\\"message\\\":\\\"ar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"ClientsPreferCBOR\\\\\\\" enabled=false\\\\nW0322 00:09:59.459991 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0322 00:09:59.460237 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0322 00:09:59.461209 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2974540120/tls.crt::/tmp/serving-cert-2974540120/tls.key\\\\\\\"\\\\nI0322 00:09:59.961022 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0322 00:09:59.962668 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0322 00:09:59.962684 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0322 00:09:59.962706 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0322 00:09:59.962712 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0322 00:09:59.967903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0322 00:09:59.967930 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967936 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967945 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0322 00:09:59.967951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0322 00:09:59.967956 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0322 00:09:59.967963 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0322 00:09:59.967953 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0322 00:09:59.969056 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-22T00:09:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.850440 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.861453 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-66g6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.879192 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec484e57-1508-45a3-99a3-51dfa8ef6195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n9zvq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885671 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885736 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885762 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885790 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885813 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886002 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886052 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886082 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886104 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886129 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886150 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") pod 
\"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886186 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886213 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886235 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886274 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886301 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886323 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886500 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886749 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887349 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887375 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887530 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887687 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887762 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" (OuterVolumeSpecName: "tmp") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887724 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" (OuterVolumeSpecName: "config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887816 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887847 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887876 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887918 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887969 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887988 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888009 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888027 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888044 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888062 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.888084 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddnv\" (UniqueName: \"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") pod \"e093be35-bb62-4843-b2e8-094545761610\" (UID: \"e093be35-bb62-4843-b2e8-094545761610\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888106 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888125 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888142 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888160 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888195 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888191 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888216 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888224 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888232 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888247 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jjkz\" (UniqueName: \"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888361 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888400 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888436 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888464 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888495 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888525 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888535 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" (OuterVolumeSpecName: "kube-api-access-7jjkz") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "kube-api-access-7jjkz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888551 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888582 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888582 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" (OuterVolumeSpecName: 
"kube-api-access-tkdh6") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "kube-api-access-tkdh6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888598 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888612 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888642 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889004 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889023 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889059 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889091 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889113 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889138 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889161 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889196 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889206 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889261 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889328 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889349 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") pod 
\"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889369 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889393 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889415 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889443 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889472 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889521 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889543 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889565 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889587 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889607 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc 
kubenswrapper[5116]: I0322 00:10:22.889646 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889654 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889507 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889754 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") pod \"0effdbcf-dd7d-404d-9d48-77536d665a5d\" (UID: \"0effdbcf-dd7d-404d-9d48-77536d665a5d\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889789 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889813 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889834 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889831 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" (OuterVolumeSpecName: "images") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889858 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889887 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889908 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889930 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889959 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889983 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890009 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890034 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890060 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890069 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" (OuterVolumeSpecName: "tmp") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890089 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890105 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" (OuterVolumeSpecName: "config") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890122 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890127 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" (OuterVolumeSpecName: "config") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890154 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890202 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890205 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890228 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890261 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890289 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890312 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890333 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890356 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890380 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890412 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890429 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" (OuterVolumeSpecName: "kube-api-access-6dmhf") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "kube-api-access-6dmhf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890437 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890452 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" (OuterVolumeSpecName: "kube-api-access-ptkcf") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "kube-api-access-ptkcf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890510 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890537 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890557 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.890574 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890601 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pskd\" (UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890628 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grwfz\" (UniqueName: \"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890654 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890673 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") pod \"af41de71-79cf-4590-bbe9-9e8b848862cb\" (UID: \"af41de71-79cf-4590-bbe9-9e8b848862cb\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890696 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8zz\" (UniqueName: 
\"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890720 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890741 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890761 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890782 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890799 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: 
\"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890816 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890840 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890861 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890880 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890916 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") pod \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890933 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890952 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890969 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890992 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891010 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: 
\"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891034 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891052 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891073 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891091 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmmzf\" (UniqueName: \"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891109 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891127 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891145 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891188 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8ts\" (UniqueName: \"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891210 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891227 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891246 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: 
\"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891262 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891282 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891303 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891325 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891346 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891371 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886911 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" (OuterVolumeSpecName: "kube-api-access-6rmnv") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "kube-api-access-6rmnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891394 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890631 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" (OuterVolumeSpecName: "kube-api-access-8nspp") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "kube-api-access-8nspp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890706 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891422 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891445 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892210 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892264 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892295 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892348 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892372 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892400 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892426 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892455 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892481 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: 
\"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892507 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892532 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892554 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892580 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zth6t\" (UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892607 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892638 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892665 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pllx6\" (UniqueName: \"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892687 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892713 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892737 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892760 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.892788 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892811 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892837 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892862 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892885 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892909 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") pod 
\"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892930 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892954 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892978 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892999 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893026 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893054 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893080 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893104 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893062 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe8361c-9ce7-48cd-9142-ae635e1b27d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\"
:0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893126 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893430 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893460 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893488 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893511 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893532 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893554 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893574 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893592 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893610 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893630 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.893649 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893670 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893690 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893712 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893731 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893754 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") 
pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893784 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893815 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893843 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893938 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893970 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893997 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894027 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894056 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894083 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894113 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894140 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894192 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894223 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894256 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmqg\" (UniqueName: \"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894290 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894326 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894367 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894394 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894428 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896566 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") pod \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896671 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896720 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896756 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896805 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896857 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896890 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896929 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896965 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896999 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897031 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897062 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897105 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") "
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897246 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d36c245b-3d7f-48eb-848e-c54198ae38a4-hosts-file\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897287 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-os-release\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897316 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-etc-kubernetes\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897348 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-kubelet\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897376 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-rootfs\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897411 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qscfn\" (UniqueName: \"kubernetes.io/projected/1811891e-33d0-4500-a481-0e4aa2d3e95c-kube-api-access-qscfn\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897439 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897480 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5zcr\" (UniqueName: \"kubernetes.io/projected/d36c245b-3d7f-48eb-848e-c54198ae38a4-kube-api-access-x5zcr\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897511 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897538 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897568 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897746 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d36c245b-3d7f-48eb-848e-c54198ae38a4-tmp-dir\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897782 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-cnibin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897816 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-socket-dir-parent\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897855 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1811891e-33d0-4500-a481-0e4aa2d3e95c-serviceca\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897885 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897967 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-k8s-cni-cncf-io\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898006 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-netns\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898034 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-proxy-tls\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898079 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-system-cni-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898119 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-cnibin\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1811891e-33d0-4500-a481-0e4aa2d3e95c-host\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898206 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898238 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898263 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898292 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898350 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-system-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898396 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7gh6\" (UniqueName: \"kubernetes.io/projected/68dcbc21-b4ce-4285-9a4b-101724f82f33-kube-api-access-b7gh6\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898427 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898490 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-multus\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898532 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-multus-daemon-config\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898565 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898592 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-binary-copy\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898630 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898660 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898690 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-conf-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898730 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898759 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-bin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-hostroot\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898811 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898841 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-multus-certs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898883 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrqs\" (UniqueName: \"kubernetes.io/projected/5188f25b-37c3-46f1-b939-199c6e082848-kube-api-access-sqrqs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898943 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkhq\" (UniqueName: \"kubernetes.io/projected/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-kube-api-access-5pkhq\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898975 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899012 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-os-release\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899050 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899101 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzzhb\" (UniqueName: \"kubernetes.io/projected/94c19a90-c2c9-4236-98be-a0516dbb840b-kube-api-access-lzzhb\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899146 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899202 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899230 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899254 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899282 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899318 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-cni-binary-copy\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899349 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899376 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899404 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899431 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899498 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899523 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899552 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899589 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899625 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899751 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899776 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899793 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899810 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899824 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899840 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899860 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jjkz\" (UniqueName: \"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899876 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899892 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899909 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899928 5116 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899944 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899958 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899973 5116 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899989 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900005 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900020 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900037 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900056 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900070 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900086 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900101 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900115 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900129 5116 reconciler_common.go:299] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:22 crc
kubenswrapper[5116]: I0322 00:10:22.900145 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900160 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900217 5116 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900314 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902869 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-conf-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903359 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903412 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d36c245b-3d7f-48eb-848e-c54198ae38a4-tmp-dir\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903432 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-bin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903473 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-hostroot\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903485 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-cnibin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903523 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-multus-certs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903545 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-socket-dir-parent\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " 
pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903613 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-k8s-cni-cncf-io\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903734 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-system-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903805 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-multus\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.905421 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-netns\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.905698 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-multus-daemon-config\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.906560 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-cni-binary-copy\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.906780 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891297 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891369 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891761 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" (OuterVolumeSpecName: "kube-api-access-ks6v2") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "kube-api-access-ks6v2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891918 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913504 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-kubelet\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913558 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" (OuterVolumeSpecName: "kube-api-access-4g8ts") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "kube-api-access-4g8ts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891967 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892126 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" (OuterVolumeSpecName: "kube-api-access-zsb9b") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "kube-api-access-zsb9b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892270 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" (OuterVolumeSpecName: "kube-api-access-dztfv") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "kube-api-access-dztfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892478 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" (OuterVolumeSpecName: "kube-api-access-pddnv") pod "e093be35-bb62-4843-b2e8-094545761610" (UID: "e093be35-bb62-4843-b2e8-094545761610"). InnerVolumeSpecName "kube-api-access-pddnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892773 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" (OuterVolumeSpecName: "kube-api-access-xfp5s") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "kube-api-access-xfp5s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913809 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" (OuterVolumeSpecName: "config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913854 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" (OuterVolumeSpecName: "config-volume") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892850 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892868 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" (OuterVolumeSpecName: "kube-api-access-xnxbn") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "kube-api-access-xnxbn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892993 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893051 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" (OuterVolumeSpecName: "kube-api-access-l9stx") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "kube-api-access-l9stx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893203 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" (OuterVolumeSpecName: "console-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893677 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" (OuterVolumeSpecName: "kube-api-access-6g4lr") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "kube-api-access-6g4lr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893843 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894258 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894423 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894493 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894649 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" (OuterVolumeSpecName: "utilities") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" (OuterVolumeSpecName: "config") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895071 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895337 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" (OuterVolumeSpecName: "tmp") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895560 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" (OuterVolumeSpecName: "kube-api-access-w94wk") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "kube-api-access-w94wk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896407 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896440 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895976 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" (OuterVolumeSpecName: "config") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895667 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" (OuterVolumeSpecName: "config") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896552 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" (OuterVolumeSpecName: "utilities") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896659 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897828 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" (OuterVolumeSpecName: "utilities") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897889 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898191 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" (OuterVolumeSpecName: "client-ca") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898263 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898389 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" (OuterVolumeSpecName: "ca-trust-extracted-pem") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "ca-trust-extracted-pem". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898628 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" (OuterVolumeSpecName: "client-ca") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899471 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" (OuterVolumeSpecName: "kube-api-access-wj4qr") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "kube-api-access-wj4qr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899584 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" (OuterVolumeSpecName: "kube-api-access-4hb7m") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "kube-api-access-4hb7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899604 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899722 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899885 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900014 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900191 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" (OuterVolumeSpecName: "kube-api-access-mfzkj") pod "0effdbcf-dd7d-404d-9d48-77536d665a5d" (UID: "0effdbcf-dd7d-404d-9d48-77536d665a5d"). InnerVolumeSpecName "kube-api-access-mfzkj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900220 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900230 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900326 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" (OuterVolumeSpecName: "tmp") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900860 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" (OuterVolumeSpecName: "config") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.900907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.400884681 +0000 UTC m=+94.423186054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914538 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" (OuterVolumeSpecName: "kube-api-access-sbc2l") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "kube-api-access-sbc2l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900901 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" (OuterVolumeSpecName: "utilities") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900915 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901107 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" (OuterVolumeSpecName: "cert") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901541 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901547 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" (OuterVolumeSpecName: "tmp") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900832 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901657 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901527 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901695 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901980 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" (OuterVolumeSpecName: "kube-api-access-9vsz9") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "kube-api-access-9vsz9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901986 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" (OuterVolumeSpecName: "kube-api-access-8pskd") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "kube-api-access-8pskd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901983 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901999 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" (OuterVolumeSpecName: "audit") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902012 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" (OuterVolumeSpecName: "config") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914670 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901862 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902389 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" (OuterVolumeSpecName: "kube-api-access-m26jq") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "kube-api-access-m26jq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902435 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" (OuterVolumeSpecName: "kube-api-access-xxfcv") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "kube-api-access-xxfcv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902548 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902714 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" (OuterVolumeSpecName: "kube-api-access-d4tqq") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "kube-api-access-d4tqq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902614 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902853 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903032 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903066 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903089 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" (OuterVolumeSpecName: "kube-api-access-ftwb6") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "kube-api-access-ftwb6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903150 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914784 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" (OuterVolumeSpecName: "service-ca") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914808 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" (OuterVolumeSpecName: "kube-api-access-qgrkj") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "kube-api-access-qgrkj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903379 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903395 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903393 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" (OuterVolumeSpecName: "config") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903559 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" (OuterVolumeSpecName: "kube-api-access-ddlk9") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "kube-api-access-ddlk9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904296 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904136 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" (OuterVolumeSpecName: "utilities") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904359 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" (OuterVolumeSpecName: "kube-api-access-5lcfw") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "kube-api-access-5lcfw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904374 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904400 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.908449 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.908755 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909095 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909146 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909462 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" (OuterVolumeSpecName: "kube-api-access-d7cps") pod "af41de71-79cf-4590-bbe9-9e8b848862cb" (UID: "af41de71-79cf-4590-bbe9-9e8b848862cb"). InnerVolumeSpecName "kube-api-access-d7cps". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909919 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" (OuterVolumeSpecName: "kube-api-access-pgx6b") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "kube-api-access-pgx6b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909881 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" (OuterVolumeSpecName: "kube-api-access-qqbfk") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "kube-api-access-qqbfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909976 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" (OuterVolumeSpecName: "kube-api-access-nmmzf") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "kube-api-access-nmmzf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910023 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910296 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" (OuterVolumeSpecName: "kube-api-access-m5lgh") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "kube-api-access-m5lgh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910371 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910391 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" (OuterVolumeSpecName: "kube-api-access-grwfz") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "kube-api-access-grwfz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910667 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" (OuterVolumeSpecName: "kube-api-access-hckvg") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "kube-api-access-hckvg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910711 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910708 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" (OuterVolumeSpecName: "config") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910743 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" (OuterVolumeSpecName: "config") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911069 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911065 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911083 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" (OuterVolumeSpecName: "kube-api-access-l87hs") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "kube-api-access-l87hs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911258 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911434 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" (OuterVolumeSpecName: "kube-api-access-9z4sw") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "kube-api-access-9z4sw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911454 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911755 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" (OuterVolumeSpecName: "kube-api-access-mjwtd") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "kube-api-access-mjwtd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911824 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" (OuterVolumeSpecName: "serviceca") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911985 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" (OuterVolumeSpecName: "kube-api-access-94l9h") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "kube-api-access-94l9h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912425 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" (OuterVolumeSpecName: "kube-api-access-zth6t") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "kube-api-access-zth6t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912397 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912510 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912464 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912595 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912580 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912686 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" (OuterVolumeSpecName: "kube-api-access-twvbl") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "kube-api-access-twvbl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913121 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" (OuterVolumeSpecName: "utilities") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913269 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" (OuterVolumeSpecName: "config") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.915534 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" (OuterVolumeSpecName: "kube-api-access-99zj9") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "kube-api-access-99zj9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.915954 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916035 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-proxy-tls\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916052 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-rootfs\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916447 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916564 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917024 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" (OuterVolumeSpecName: "utilities") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917341 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" (OuterVolumeSpecName: "kube-api-access-zg8nc") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "kube-api-access-zg8nc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917609 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917771 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" (OuterVolumeSpecName: "kube-api-access-pllx6") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "kube-api-access-pllx6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918088 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918243 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918413 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918691 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918746 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" (OuterVolumeSpecName: "utilities") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919320 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919410 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" (OuterVolumeSpecName: "images") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919547 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919788 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919887 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919916 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-os-release\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919959 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-etc-kubernetes\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.920261 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d36c245b-3d7f-48eb-848e-c54198ae38a4-hosts-file\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc 
kubenswrapper[5116]: I0322 00:10:22.920241 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b44f09af-f7e2-4bcb-bdba-55ff8b81f5de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95
ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:53Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":
{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.920594 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.920728 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921071 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" (OuterVolumeSpecName: "kube-api-access-rzt4w") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "kube-api-access-rzt4w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921125 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921325 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" (OuterVolumeSpecName: "certs") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921361 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921408 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921632 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" (OuterVolumeSpecName: "signing-key") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921879 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.922323 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.922482 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.922640 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921533 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.923796 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" (OuterVolumeSpecName: "kube-api-access-hm9x7") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "kube-api-access-hm9x7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.924400 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" (OuterVolumeSpecName: "kube-api-access-8nb9c") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "kube-api-access-8nb9c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.930939 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.931578 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932119 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932152 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932184 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" (OuterVolumeSpecName: "kube-api-access-q4smf") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "kube-api-access-q4smf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932511 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932794 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932807 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932876 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.933104 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkhq\" (UniqueName: \"kubernetes.io/projected/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-kube-api-access-5pkhq\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.933264 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" (OuterVolumeSpecName: "kube-api-access-tknt7") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "kube-api-access-tknt7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.933337 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrqs\" (UniqueName: \"kubernetes.io/projected/5188f25b-37c3-46f1-b939-199c6e082848-kube-api-access-sqrqs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934568 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934570 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934803 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" (OuterVolumeSpecName: "kube-api-access-wbmqg") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "kube-api-access-wbmqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934924 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" (OuterVolumeSpecName: "kube-api-access-z5rsr") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "kube-api-access-z5rsr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935179 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935354 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935475 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935428 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935779 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935817 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935821 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935833 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" (OuterVolumeSpecName: "tmp") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935831 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935897 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" (OuterVolumeSpecName: "whereabouts-flatfile-configmap") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "whereabouts-flatfile-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936148 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936225 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936601 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" (OuterVolumeSpecName: "kube-api-access-ws8zz") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "kube-api-access-ws8zz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936603 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" (OuterVolumeSpecName: "config") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937092 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" (OuterVolumeSpecName: "config") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937289 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937384 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937505 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937584 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937657 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" (OuterVolumeSpecName: "kube-api-access-26xrl") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "kube-api-access-26xrl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937659 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937961 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.938073 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5zcr\" (UniqueName: \"kubernetes.io/projected/d36c245b-3d7f-48eb-848e-c54198ae38a4-kube-api-access-x5zcr\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.938418 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.939775 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.939909 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.939931 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.940658 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.941235 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.951183 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.951751 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.961182 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.965372 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.967158 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.968925 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.977438 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17ab744-68a7-4a24-8ef2-556696d752fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-bd7p4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.980359 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.982726 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.986191 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2rwjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1811891e-33d0-4500-a481-0e4aa2d3e95c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qscfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2rwjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.987247 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.994723 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.000963 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-os-release\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.000997 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001016 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzzhb\" (UniqueName: \"kubernetes.io/projected/94c19a90-c2c9-4236-98be-a0516dbb840b-kube-api-access-lzzhb\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001037 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001056 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001074 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001093 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001094 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-os-release\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001112 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001239 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001282 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001320 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001352 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001383 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001393 5116 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001424 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001456 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001487 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001533 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qscfn\" (UniqueName: \"kubernetes.io/projected/1811891e-33d0-4500-a481-0e4aa2d3e95c-kube-api-access-qscfn\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001565 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001603 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001639 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001681 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1811891e-33d0-4500-a481-0e4aa2d3e95c-serviceca\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001721 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001787 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-system-cni-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001830 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-cnibin\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001864 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1811891e-33d0-4500-a481-0e4aa2d3e95c-host\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001871 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001896 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001946 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001950 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002010 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1811891e-33d0-4500-a481-0e4aa2d3e95c-host\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002018 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002025 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002050 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ovnkube-node-n9zvq\" (UID: 
\"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002076 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002115 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002152 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002790 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002903 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7gh6\" (UniqueName: \"kubernetes.io/projected/68dcbc21-b4ce-4285-9a4b-101724f82f33-kube-api-access-b7gh6\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " 
pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002957 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003032 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003087 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-binary-copy\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003137 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003201 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ovnkube-node-n9zvq\" 
(UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003254 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003519 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003617 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004191 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004292 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod 
\"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004363 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004584 5116 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010580 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010592 5116 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.006281 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008449 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008478 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-system-cni-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008517 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-cnibin\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008544 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008569 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.009270 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.009318 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-binary-copy\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.009337 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.009441 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010251 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010421 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.007397 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.005011 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dcbc21-b4ce-4285-9a4b-101724f82f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"
name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bk75f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.011968 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.511948195 +0000 UTC m=+94.534249568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012068 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012255 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012377 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012408 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012453 5116 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012492 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") on 
node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012585 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pddnv\" (UniqueName: \"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015157 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015242 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015258 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015273 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015798 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9sq6c" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016468 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016531 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016547 5116 reconciler_common.go:299] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016561 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016584 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016598 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016612 5116 reconciler_common.go:299] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 
crc kubenswrapper[5116]: I0322 00:10:23.016625 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016647 5116 reconciler_common.go:299] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016665 5116 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016678 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016692 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016709 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016721 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016737 5116 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016756 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016768 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016782 5116 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016796 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016812 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016832 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016845 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") on node \"crc\" DevicePath 
\"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016861 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016878 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016890 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016902 5116 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016914 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016930 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016943 5116 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016955 5116 reconciler_common.go:299] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016971 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016983 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016995 5116 reconciler_common.go:299] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017008 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017024 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017036 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017049 5116 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") 
on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017061 5116 reconciler_common.go:299] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017081 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017094 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017106 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017122 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017133 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017145 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017156 5116 
reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017195 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017208 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017224 5116 reconciler_common.go:299] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017236 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017252 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017268 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017282 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017296 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017313 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017325 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017338 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017355 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017368 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pskd\" (UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017380 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grwfz\" (UniqueName: 
\"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017391 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017429 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017676 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017704 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ws8zz\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017726 5116 reconciler_common.go:299] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017755 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017776 
5116 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017800 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017821 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018137 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018208 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018230 5116 reconciler_common.go:299] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018249 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018269 5116 
reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018361 5116 reconciler_common.go:299] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018387 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018411 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018437 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018457 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018474 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018493 5116 reconciler_common.go:299] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018516 5116 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018534 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018555 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmmzf\" (UniqueName: \"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018575 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018599 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018617 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018637 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g8ts\" (UniqueName: 
\"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018659 5116 reconciler_common.go:299] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018682 5116 reconciler_common.go:299] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018699 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018719 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018743 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018762 5116 reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018780 5116 reconciler_common.go:299] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") on node \"crc\" 
DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018798 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018821 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018839 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018856 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018875 5116 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018901 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018919 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018949 5116 reconciler_common.go:299] "Volume detached 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018968 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018990 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019012 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019032 5116 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019056 5116 reconciler_common.go:299] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019075 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019094 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019113 5116 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019135 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019155 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zth6t\" (UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019197 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019215 5116 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019238 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pllx6\" (UniqueName: \"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019256 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") on node \"crc\" 
DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019275 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.021079 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.021561 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1811891e-33d0-4500-a481-0e4aa2d3e95c-serviceca\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.023188 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.025283 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.026926 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 
00:10:23.029536 5116 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029569 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029587 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029600 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029611 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029652 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029663 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029675 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029686 5116 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029697 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029708 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029720 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029833 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029845 5116 reconciler_common.go:299] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029855 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.029866 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029877 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029888 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029899 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029912 5116 reconciler_common.go:299] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029926 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029936 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029945 5116 reconciler_common.go:299] 
"Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029954 5116 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029964 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029974 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029984 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029995 5116 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030005 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030015 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030028 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030041 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030051 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030062 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030098 5116 reconciler_common.go:299] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030127 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030158 5116 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.030207 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030217 5116 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030228 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030238 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030356 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030371 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030384 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbmqg\" (UniqueName: \"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030398 5116 reconciler_common.go:299] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030409 5116 reconciler_common.go:299] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030420 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030431 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030443 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030454 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030466 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030481 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030579 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058dd31f-b3ad-4ab1-a174-760d8eb305f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:
52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030767 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzzhb\" (UniqueName: \"kubernetes.io/projected/94c19a90-c2c9-4236-98be-a0516dbb840b-kube-api-access-lzzhb\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030834 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030969 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"269e063a36b7b7fc11164fcc08c4e2be795a7587abb86cc3cd3059814b1428ed"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031089 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031105 5116 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031116 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031126 
5116 reconciler_common.go:299] "Volume detached for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031137 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031150 5116 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031181 5116 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031194 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031203 5116 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031214 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031223 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031234 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.032251 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"e2e86fbd43894a1b6caf7d25e4e7fdfc49ae025f450dc0712baf376e54c613b2"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.033746 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.037649 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"f01464e4e01b07766168893ba8966ef163e5529f1b9bf0bd071a9fcee59ea506"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.037932 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qscfn\" (UniqueName: \"kubernetes.io/projected/1811891e-33d0-4500-a481-0e4aa2d3e95c-kube-api-access-qscfn\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041426 5116 kubelet_node_status.go:736] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041576 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041629 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041992 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.042306 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.042354 5116 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b7gh6\" (UniqueName: \"kubernetes.io/projected/68dcbc21-b4ce-4285-9a4b-101724f82f33-kube-api-access-b7gh6\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.042636 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.055528 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0911025-04f5-4040-a72c-14769d03d8e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name
\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\
":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.055853 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.068541 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.073024 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.078913 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.087568 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.096613 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.099642 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.108789 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94c19a90-c2c9-4236-98be-a0516dbb840b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlq8c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: W0322 00:10:23.120030 5116 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1811891e_33d0_4500_a481_0e4aa2d3e95c.slice/crio-16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5 WatchSource:0}: Error finding container 16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5: Status 404 returned error can't find the container with id 16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5 Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.123825 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: W0322 00:10:23.141405 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec484e57_1508_45a3_99a3_51dfa8ef6195.slice/crio-43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9 WatchSource:0}: Error finding container 43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9: Status 404 returned error can't find the container with id 43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9 Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147227 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147280 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147292 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147311 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147322 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252138 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252251 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252269 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252286 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252301 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334377 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334408 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334451 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334497 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334553 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.334538441 +0000 UTC m=+95.356839814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334871 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334893 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334924 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.334916563 +0000 UTC m=+95.357217926 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334894 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334940 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334960 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.334954495 +0000 UTC m=+95.357255868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335525 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335560 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335577 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335648 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.335628266 +0000 UTC m=+95.357929709 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354898 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354933 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354941 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354956 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354967 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.434910 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.435081 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.435054711 +0000 UTC m=+95.457356084 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.457982 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.458034 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.458046 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.458063 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: 
I0322 00:10:23.458076 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.535919 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.536079 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.536142 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.536126518 +0000 UTC m=+95.558427891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560776 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560810 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560819 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560834 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560857 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663613 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663921 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663939 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663959 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663973 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.704238 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01080b46-74f1-4191-8755-5152a57b3b25" path="/var/lib/kubelet/pods/01080b46-74f1-4191-8755-5152a57b3b25/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.705017 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09cfa50b-4138-4585-a53e-64dd3ab73335" path="/var/lib/kubelet/pods/09cfa50b-4138-4585-a53e-64dd3ab73335/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.725821 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" path="/var/lib/kubelet/pods/0dd0fbac-8c0d-4228-8faa-abbeedabf7db/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.733639 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0effdbcf-dd7d-404d-9d48-77536d665a5d" path="/var/lib/kubelet/pods/0effdbcf-dd7d-404d-9d48-77536d665a5d/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.752683 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149b3c48-e17c-4a66-a835-d86dabf6ff13" path="/var/lib/kubelet/pods/149b3c48-e17c-4a66-a835-d86dabf6ff13/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.764469 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16bdd140-dce1-464c-ab47-dd5798d1d256" path="/var/lib/kubelet/pods/16bdd140-dce1-464c-ab47-dd5798d1d256/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.765959 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f80adb-c1c3-49ba-8ee4-932c851d3897" path="/var/lib/kubelet/pods/18f80adb-c1c3-49ba-8ee4-932c851d3897/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767499 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.767545 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767558 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767574 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767601 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.773596 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" path="/var/lib/kubelet/pods/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.774591 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2325ffef-9d5b-447f-b00e-3efc429acefe" path="/var/lib/kubelet/pods/2325ffef-9d5b-447f-b00e-3efc429acefe/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.778377 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301e1965-1754-483d-b6cc-bfae7038bbca" path="/var/lib/kubelet/pods/301e1965-1754-483d-b6cc-bfae7038bbca/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.781094 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fa8943-81cc-4750-a0b7-0fa9ab5af883" path="/var/lib/kubelet/pods/31fa8943-81cc-4750-a0b7-0fa9ab5af883/volumes" Mar 22 
00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.787735 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42a11a02-47e1-488f-b270-2679d3298b0e" path="/var/lib/kubelet/pods/42a11a02-47e1-488f-b270-2679d3298b0e/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.788841 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567683bd-0efc-4f21-b076-e28559628404" path="/var/lib/kubelet/pods/567683bd-0efc-4f21-b076-e28559628404/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.809466 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584e1f4a-8205-47d7-8efb-3afc6017c4c9" path="/var/lib/kubelet/pods/584e1f4a-8205-47d7-8efb-3afc6017c4c9/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.810053 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593a3561-7760-45c5-8f91-5aaef7475d0f" path="/var/lib/kubelet/pods/593a3561-7760-45c5-8f91-5aaef7475d0f/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.813696 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebfebf6-3ecd-458e-943f-bb25b52e2718" path="/var/lib/kubelet/pods/5ebfebf6-3ecd-458e-943f-bb25b52e2718/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.815207 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6077b63e-53a2-4f96-9d56-1ce0324e4913" path="/var/lib/kubelet/pods/6077b63e-53a2-4f96-9d56-1ce0324e4913/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.817467 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" path="/var/lib/kubelet/pods/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.825917 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edfcf45-925b-4eff-b940-95b6fc0b85d4" path="/var/lib/kubelet/pods/6edfcf45-925b-4eff-b940-95b6fc0b85d4/volumes" Mar 22 
00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.837672 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee8fbd3-1f81-4666-96da-5afc70819f1a" path="/var/lib/kubelet/pods/6ee8fbd3-1f81-4666-96da-5afc70819f1a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.841502 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" path="/var/lib/kubelet/pods/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.851825 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736c54fe-349c-4bb9-870a-d1c1d1c03831" path="/var/lib/kubelet/pods/736c54fe-349c-4bb9-870a-d1c1d1c03831/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.854745 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7599e0b6-bddf-4def-b7f2-0b32206e8651" path="/var/lib/kubelet/pods/7599e0b6-bddf-4def-b7f2-0b32206e8651/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.862699 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afa918d-be67-40a6-803c-d3b0ae99d815" path="/var/lib/kubelet/pods/7afa918d-be67-40a6-803c-d3b0ae99d815/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.864268 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df94c10-441d-4386-93a6-6730fb7bcde0" path="/var/lib/kubelet/pods/7df94c10-441d-4386-93a6-6730fb7bcde0/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.867680 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" path="/var/lib/kubelet/pods/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871781 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871858 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871875 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871905 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871926 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.879527 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e39f7b-62e4-4fc9-992a-6535ce127a02" path="/var/lib/kubelet/pods/81e39f7b-62e4-4fc9-992a-6535ce127a02/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.880684 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869851b9-7ffb-4af0-b166-1d8aa40a5f80" path="/var/lib/kubelet/pods/869851b9-7ffb-4af0-b166-1d8aa40a5f80/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.888778 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" path="/var/lib/kubelet/pods/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.889493 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92dfbade-90b6-4169-8c07-72cff7f2c82b" path="/var/lib/kubelet/pods/92dfbade-90b6-4169-8c07-72cff7f2c82b/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 
00:10:23.895196 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a6e063-3d1a-4d44-875d-185291448c31" path="/var/lib/kubelet/pods/94a6e063-3d1a-4d44-875d-185291448c31/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.897613 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f71a554-e414-4bc3-96d2-674060397afe" path="/var/lib/kubelet/pods/9f71a554-e414-4bc3-96d2-674060397afe/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.909538 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a208c9c2-333b-4b4a-be0d-bc32ec38a821" path="/var/lib/kubelet/pods/a208c9c2-333b-4b4a-be0d-bc32ec38a821/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.912495 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" path="/var/lib/kubelet/pods/a52afe44-fb37-46ed-a1f8-bf39727a3cbe/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.914473 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a555ff2e-0be6-46d5-897d-863bb92ae2b3" path="/var/lib/kubelet/pods/a555ff2e-0be6-46d5-897d-863bb92ae2b3/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.915259 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a88189-c967-4640-879e-27665747f20c" path="/var/lib/kubelet/pods/a7a88189-c967-4640-879e-27665747f20c/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.919263 5116 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.919527 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" 
path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.927549 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af41de71-79cf-4590-bbe9-9e8b848862cb" path="/var/lib/kubelet/pods/af41de71-79cf-4590-bbe9-9e8b848862cb/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.931896 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" path="/var/lib/kubelet/pods/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.938109 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4750666-1362-4001-abd0-6f89964cc621" path="/var/lib/kubelet/pods/b4750666-1362-4001-abd0-6f89964cc621/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.943843 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b605f283-6f2e-42da-a838-54421690f7d0" path="/var/lib/kubelet/pods/b605f283-6f2e-42da-a838-54421690f7d0/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.944432 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c491984c-7d4b-44aa-8c1e-d7974424fa47" path="/var/lib/kubelet/pods/c491984c-7d4b-44aa-8c1e-d7974424fa47/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.946056 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f2bfad-70f6-4185-a3d9-81ce12720767" path="/var/lib/kubelet/pods/c5f2bfad-70f6-4185-a3d9-81ce12720767/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.946973 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc85e424-18b2-4924-920b-bd291a8c4b01" path="/var/lib/kubelet/pods/cc85e424-18b2-4924-920b-bd291a8c4b01/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.947504 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce090a97-9ab6-4c40-a719-64ff2acd9778" 
path="/var/lib/kubelet/pods/ce090a97-9ab6-4c40-a719-64ff2acd9778/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.949145 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19cb085-0c5b-4810-b654-ce7923221d90" path="/var/lib/kubelet/pods/d19cb085-0c5b-4810-b654-ce7923221d90/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.951161 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" path="/var/lib/kubelet/pods/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.953008 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d565531a-ff86-4608-9d19-767de01ac31b" path="/var/lib/kubelet/pods/d565531a-ff86-4608-9d19-767de01ac31b/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.954415 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e8f42f-dc0e-424b-bb56-5ec849834888" path="/var/lib/kubelet/pods/d7e8f42f-dc0e-424b-bb56-5ec849834888/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.957353 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" path="/var/lib/kubelet/pods/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.961420 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e093be35-bb62-4843-b2e8-094545761610" path="/var/lib/kubelet/pods/e093be35-bb62-4843-b2e8-094545761610/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.962231 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" path="/var/lib/kubelet/pods/e1d2a42d-af1d-4054-9618-ab545e0ed8b7/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.963739 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f559dfa3-3917-43a2-97f6-61ddfda10e93" 
path="/var/lib/kubelet/pods/f559dfa3-3917-43a2-97f6-61ddfda10e93/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.971139 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65c0ac1-8bca-454d-a2e6-e35cb418beac" path="/var/lib/kubelet/pods/f65c0ac1-8bca-454d-a2e6-e35cb418beac/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.972503 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" path="/var/lib/kubelet/pods/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.974427 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e2c886-118e-43bb-bef1-c78134de392b" path="/var/lib/kubelet/pods/f7e2c886-118e-43bb-bef1-c78134de392b/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975687 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975739 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975751 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975768 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975779 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.976792 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" path="/var/lib/kubelet/pods/fc8db2c7-859d-47b3-a900-2bd0c0b2973b/volumes" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.043372 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerStarted","Data":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.043430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerStarted","Data":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.043448 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerStarted","Data":"7eadfb4600290cb56b95da12f03d4c885e0344117c4889ec529ea4aaac7dd7ce"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.047500 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"4f407ddd10876e1a8c14d096fd93218ee2e273d2e13b40b7f47ac97e4f7577c8"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.047552 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"57fbcaaf231fb8acf6f6165c6611af0cf0d6cf135ea43f7a7481a2124db8ed43"} Mar 22 00:10:24 crc 
kubenswrapper[5116]: I0322 00:10:24.049540 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.049599 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"a55e076fcabd345547e41146fd9e4a751e99d82dd2f925d7e04407dbcd5c1367"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.053159 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" exitCode=0 Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.053191 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.053334 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.055787 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"1d1e21356eefd317bcf9ab8691392e59de36a5c78a5c0e2abe762755ab45df92"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.055821 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.055834 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"b7f3005da76032ee4dab81b7f41c5734f96c4b93a470a0cb1ed78aa7bf231102"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.058266 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerStarted","Data":"15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.058310 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerStarted","Data":"29c4f6cffa6dd6b4a39e497ff4f5ad41df8eb7d4e7d5b313b0b79363eb66c70c"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.058365 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-66g6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.060686 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwnjb" event={"ID":"d36c245b-3d7f-48eb-848e-c54198ae38a4","Type":"ContainerStarted","Data":"b9bc4199ef1749e4a84f489f40e0390b5b26d91adc18920dda60b81ab52cadeb"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.060749 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwnjb" event={"ID":"d36c245b-3d7f-48eb-848e-c54198ae38a4","Type":"ContainerStarted","Data":"535e6dea8944ee8c6d28f33eb0fb646fa6fc219a9d2f1e217232b2b43b2fcbfb"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.062747 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rwjp" event={"ID":"1811891e-33d0-4500-a481-0e4aa2d3e95c","Type":"ContainerStarted","Data":"f51b492b0338abc8e31fd29413b4b5d43f33d541be8a08e10f8ebe50a03fb477"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.062786 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rwjp" event={"ID":"1811891e-33d0-4500-a481-0e4aa2d3e95c","Type":"ContainerStarted","Data":"16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.064405 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"b8e5f28015a7839919e338305c619f5b1210d5514ab180e75985ccbd323c7240"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.075092 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec484e57-1508-45a3-99a3-51dfa8ef6195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n9zvq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079791 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079840 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079853 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079873 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079889 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.087039 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe8361c-9ce7-48cd-9142-ae635e1b27d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\
\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.115839 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b44f09af-f7e2-4bcb-bdba-55ff8b81f5de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}}
,\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6
c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\
"containerID\\\":\\\"cri-o://88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:53Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.129199 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.140348 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.150048 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.159331 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17ab744-68a7-4a24-8ef2-556696d752fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"},\\\"containerID\\\":\\\"cri-o://6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"},\\\"containerID\\\":\\\"cri-o://ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-bd7p4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 
00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.169549 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2rwjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1811891e-33d0-4500-a481-0e4aa2d3e95c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qscfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2rwjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183114 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183186 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183202 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183222 5116 kubelet_node_status.go:736] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183235 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.184834 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dcbc21-b4ce-4285-9a4b-101724f82f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bk75f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.195969 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058dd31f-b3ad-4ab1-a174-760d8eb305f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.207647 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.222871 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0911025-04f5-4040-a72c-14769d03d8e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name
\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\
":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.236044 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.248992 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.261738 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.270880 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94c19a90-c2c9-4236-98be-a0516dbb840b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlq8c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.283026 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4ec1f0e405
3fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-22T00:09:59Z\\\",\\\"message\\\":\\\"ar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"ClientsPreferCBOR\\\\\\\" enabled=false\\\\nW0322 00:09:59.459991 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0322 00:09:59.460237 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0322 00:09:59.461209 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2974540120/tls.crt::/tmp/serving-cert-2974540120/tls.key\\\\\\\"\\\\nI0322 00:09:59.961022 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0322 00:09:59.962668 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0322 00:09:59.962684 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0322 00:09:59.962706 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0322 00:09:59.962712 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0322 00:09:59.967903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0322 00:09:59.967930 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967936 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967945 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0322 00:09:59.967951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0322 00:09:59.967956 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0322 00:09:59.967963 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0322 00:09:59.967953 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0322 00:09:59.969056 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-22T00:09:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285611 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285649 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285664 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285681 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285694 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.298267 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.307480 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94c19a90-c2c9-4236-98be-a0516dbb840b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlq8c\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.321807 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4ec1f0e405
3fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-22T00:09:59Z\\\",\\\"message\\\":\\\"ar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"ClientsPreferCBOR\\\\\\\" enabled=false\\\\nW0322 00:09:59.459991 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0322 00:09:59.460237 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0322 00:09:59.461209 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2974540120/tls.crt::/tmp/serving-cert-2974540120/tls.key\\\\\\\"\\\\nI0322 00:09:59.961022 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0322 00:09:59.962668 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0322 00:09:59.962684 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0322 00:09:59.962706 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0322 00:09:59.962712 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0322 00:09:59.967903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0322 00:09:59.967930 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967936 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967945 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0322 00:09:59.967951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0322 00:09:59.967956 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0322 00:09:59.967963 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0322 00:09:59.967953 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0322 00:09:59.969056 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-22T00:09:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.334763 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344664 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod 
\"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344712 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344738 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344769 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.344922 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.344945 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 
00:10:24.344970 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345032 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.345018675 +0000 UTC m=+97.367320048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345388 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345401 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345409 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc 
kubenswrapper[5116]: E0322 00:10:24.345434 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.345426768 +0000 UTC m=+97.367728141 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345473 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345493 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.34548788 +0000 UTC m=+97.367789253 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345560 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345590 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.345582633 +0000 UTC m=+97.367884006 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.349023 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1d1e21356eefd317bcf9ab8691392e59de36a5c78a5c0e2abe762755ab45df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-p
roxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-66g6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.366293 5116 status_manager.go:919] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec484e57-1508-45a3-99a3-51dfa8ef6195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"resources\\\":{},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\
":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n9zvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.377190 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe8361c-9ce7-48cd-9142-ae635e1b27d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387110 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387149 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387160 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387197 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387206 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.393657 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b44f09af-f7e2-4bcb-bdba-55ff8b81f5de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a
6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:53Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"exitCode\\\":0
,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\
",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.404000 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.414095 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4f407ddd10876e1a8c14d096fd93218ee2e273d2e13b40b7f47ac97e4f7577c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0,1000500000],\\\"uid\\\":1000500000}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\
"},\\\"containerID\\\":\\\"cri-o://57fbcaaf231fb8acf6f6165c6611af0cf0d6cf135ea43f7a7481a2124db8ed43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0,1000500000],\\\"uid\\\":1000500000}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.425671 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"65Mi\\\"},\\\"containerID\\\":\\\"cri-o://15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"65Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"moun
tPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc 
kubenswrapper[5116]: I0322 00:10:24.438328 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17ab744-68a7-4a24-8ef2-556696d752fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"},\\\"containerID\\\":\\\"cri-o://6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\
":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"},\\\"containerID\\\":\\\"cri-o://ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-bd7p4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.448799 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2rwjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1811891e-33d0-4500-a481-0e4aa2d3e95c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"10Mi\\\"},\\\"containerID\\\":\\\"cri-o://f51b492b0338abc8e31fd29413b4b5d43f33d541be8a08e10f8ebe50a03fb477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"10Mi\\\"}},\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":1001}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qscfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2rwjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.449560 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.449761 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.449745119 +0000 UTC m=+97.472046492 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.464916 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dcbc21-b4ce-4285-9a4b-101724f82f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"resources\\\":{},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bk75f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.476067 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058dd31f-b3ad-4ab1-a174-760d8eb305f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.485886 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"},\\\"containerID\\\":\\\"cri-o://b9bc4199ef1749e4a84f489f40e0390b5b26d91adc18920dda60b81ab52cadeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489540 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489607 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489627 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489680 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489698 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.497306 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0911025-04f5-4040-a72c-14769d03d8e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\
":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde72610
9a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.510517 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8e5f28015a7839919e338305c619f5b1210d5514ab180e75985ccbd323c7240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.522499 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.551141 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.551313 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.551410 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.551391874 +0000 UTC m=+97.573693247 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591611 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591652 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591661 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591676 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591692 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694224 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694274 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694286 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694303 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694316 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696391 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696511 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696532 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696561 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696561 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696629 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696710 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696783 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.796965 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797025 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797066 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797093 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797112 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.899930 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.899978 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.899990 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.900009 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.900021 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002208 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002274 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002289 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002309 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002320 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.069725 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab" exitCode=0 Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.069816 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074021 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074074 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074118 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074131 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074142 5116 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104338 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104376 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104385 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104400 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104418 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.118109 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=4.118088376 podStartE2EDuration="4.118088376s" podCreationTimestamp="2026-03-22 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.117376573 +0000 UTC m=+96.139677966" watchObservedRunningTime="2026-03-22 00:10:25.118088376 +0000 UTC m=+96.140389749" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.153414 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.1533991869999998 podStartE2EDuration="3.153399187s" podCreationTimestamp="2026-03-22 00:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.153018784 +0000 UTC m=+96.175320177" watchObservedRunningTime="2026-03-22 00:10:25.153399187 +0000 UTC m=+96.175700560" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.204149 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9sq6c" podStartSLOduration=75.204125515 podStartE2EDuration="1m15.204125515s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.204103485 +0000 UTC m=+96.226404858" watchObservedRunningTime="2026-03-22 00:10:25.204125515 +0000 UTC m=+96.226426888" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206424 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206461 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206471 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206487 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206498 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.223082 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" podStartSLOduration=75.223058547 podStartE2EDuration="1m15.223058547s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.222715796 +0000 UTC m=+96.245017169" watchObservedRunningTime="2026-03-22 00:10:25.223058547 +0000 UTC m=+96.245359920" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.237013 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2rwjp" podStartSLOduration=75.236996419 podStartE2EDuration="1m15.236996419s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.236370749 
+0000 UTC m=+96.258672122" watchObservedRunningTime="2026-03-22 00:10:25.236996419 +0000 UTC m=+96.259297792" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.281994 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.281961866 podStartE2EDuration="4.281961866s" podCreationTimestamp="2026-03-22 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.271186973 +0000 UTC m=+96.293488356" watchObservedRunningTime="2026-03-22 00:10:25.281961866 +0000 UTC m=+96.304263239" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.304235 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nwnjb" podStartSLOduration=76.303384675 podStartE2EDuration="1m16.303384675s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.282884935 +0000 UTC m=+96.305186308" watchObservedRunningTime="2026-03-22 00:10:25.303384675 +0000 UTC m=+96.325686038" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.304403 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.304388618 podStartE2EDuration="3.304388618s" podCreationTimestamp="2026-03-22 00:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.300555996 +0000 UTC m=+96.322857369" watchObservedRunningTime="2026-03-22 00:10:25.304388618 +0000 UTC m=+96.326689991" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308680 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308713 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308722 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308733 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308742 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411398 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411560 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411572 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411586 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411610 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514810 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514854 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514869 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514883 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514894 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616314 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616346 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616355 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616368 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616378 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718086 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718124 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718132 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718145 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718157 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820454 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820782 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820796 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820813 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820825 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923295 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923350 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923362 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923381 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923396 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.024955 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025015 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025029 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025047 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025062 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.078401 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"c6721d40927bfc9e64db3dad19ca523af11553569d2a7c235c5ff4c77b73a4b7"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.080071 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"c6ce593ebbff0cbc20d49368fa0db752daf86b2c7c0ce84e6c3ed7de07047717"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.083929 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.102916 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podStartSLOduration=76.102856763 podStartE2EDuration="1m16.102856763s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.388094554 +0000 UTC m=+96.410395957" watchObservedRunningTime="2026-03-22 00:10:26.102856763 +0000 UTC m=+97.125158136"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127233 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127243 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127260 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127271 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229854 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229900 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229910 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229926 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229936 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.331978 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332024 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332035 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332050 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332060 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379304 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379361 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379387 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379415 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379583 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379609 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379622 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379675 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379734 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379770 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379776 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379783 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379685 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379666878 +0000 UTC m=+101.401968261 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379867 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379850523 +0000 UTC m=+101.402151896 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379892 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379879374 +0000 UTC m=+101.402180787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379900524 +0000 UTC m=+101.402201967 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435096 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435185 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435215 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435236 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435248 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.480156 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.480347 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.480326822 +0000 UTC m=+101.502628195 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537492 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537543 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537555 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537573 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537585 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.581439 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.581626 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.581717 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.581697898 +0000 UTC m=+101.603999271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639316 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639371 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639383 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639400 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639412 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696453 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696530 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696623 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.696624 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.696771 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696835 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.696974 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.697243 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742377 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742431 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742443 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742461 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742477 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845090 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845141 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845152 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845194 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845208 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948475 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948544 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948557 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948577 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948590 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050640 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050699 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050714 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050731 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050743 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.089090 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="c6ce593ebbff0cbc20d49368fa0db752daf86b2c7c0ce84e6c3ed7de07047717" exitCode=0
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.089206 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"c6ce593ebbff0cbc20d49368fa0db752daf86b2c7c0ce84e6c3ed7de07047717"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.152942 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153220 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153232 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153244 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153254 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255817 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255860 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255871 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255885 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255895 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358281 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358319 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358348 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358363 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358372 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460780 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460829 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460843 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460859 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460872 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563224 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563270 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563281 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563297 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563309 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665424 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665480 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665491 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665508 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665520 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766792 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766844 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766859 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766876 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766890 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869234 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869248 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869264 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869278 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971189 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971229 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971240 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971256 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971269 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073388 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073798 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073812 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073829 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073842 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:28Z","lastTransitionTime":"2026-03-22T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.094653 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="865bb892d10d2cc98809e713630d1eee197b60ebea411ebed87f389b4f38d103" exitCode=0
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.094745 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"865bb892d10d2cc98809e713630d1eee197b60ebea411ebed87f389b4f38d103"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.101402 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171851 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171911 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171924 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171973 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171993 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:28Z","lastTransitionTime":"2026-03-22T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.186921 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.186980 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.186993 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.187014 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.187025 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:28Z","lastTransitionTime":"2026-03-22T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.215115 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r"] Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.221604 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.223570 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.223913 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.224037 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.224425 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.303841 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.303920 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc1de7fa-cee5-4f90-80c9-e9cba187456d-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.304117 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.304221 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc1de7fa-cee5-4f90-80c9-e9cba187456d-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.304269 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc1de7fa-cee5-4f90-80c9-e9cba187456d-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405558 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc1de7fa-cee5-4f90-80c9-e9cba187456d-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405649 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " 
pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc1de7fa-cee5-4f90-80c9-e9cba187456d-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405703 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc1de7fa-cee5-4f90-80c9-e9cba187456d-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405762 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405825 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405872 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.406687 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc1de7fa-cee5-4f90-80c9-e9cba187456d-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.419036 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc1de7fa-cee5-4f90-80c9-e9cba187456d-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.430483 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc1de7fa-cee5-4f90-80c9-e9cba187456d-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.544346 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: W0322 00:10:28.559370 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc1de7fa_cee5_4f90_80c9_e9cba187456d.slice/crio-d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf WatchSource:0}: Error finding container d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf: Status 404 returned error can't find the container with id d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.688859 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696762 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696762 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696853 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696860 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697001 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697115 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697191 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697262 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.698812 5116 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 22 00:10:29 crc kubenswrapper[5116]: I0322 00:10:29.105919 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" event={"ID":"cc1de7fa-cee5-4f90-80c9-e9cba187456d","Type":"ContainerStarted","Data":"d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf"} Mar 22 00:10:29 crc kubenswrapper[5116]: I0322 00:10:29.110138 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="7523ff5e45ba0c266da43232745d320ee15d7059b1b8414c9d98217b5eabcc7f" exitCode=0 Mar 22 00:10:29 crc kubenswrapper[5116]: I0322 00:10:29.110225 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"7523ff5e45ba0c266da43232745d320ee15d7059b1b8414c9d98217b5eabcc7f"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.116713 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"00794bbb5aae6e05bd0e50af39e0b4d4e26deff3b187d7676d95950bdffb8dd8"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.121364 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 
00:10:30.121644 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.121702 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.121712 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.123030 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" event={"ID":"cc1de7fa-cee5-4f90-80c9-e9cba187456d","Type":"ContainerStarted","Data":"e5174ff347d5ce3b36e185f0c9b14090e9c9bb0ff5f625d98f27d20f17fc867e"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.185747 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.196158 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.202288 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podStartSLOduration=80.202269003 podStartE2EDuration="1m20.202269003s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:30.172330303 +0000 UTC m=+101.194631696" watchObservedRunningTime="2026-03-22 00:10:30.202269003 +0000 UTC m=+101.224570386" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.202513 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" podStartSLOduration=80.202508151 podStartE2EDuration="1m20.202508151s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:30.201377855 +0000 UTC m=+101.223679228" watchObservedRunningTime="2026-03-22 00:10:30.202508151 +0000 UTC m=+101.224809524" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.433725 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.434114 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.434145 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.434198 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: 
\"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434368 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434385 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434394 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434406 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434433 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434459 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434472 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod 
openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434446 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.43443243 +0000 UTC m=+109.456733803 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434527 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434528 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.434510932 +0000 UTC m=+109.456812305 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434583 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.434569254 +0000 UTC m=+109.456870677 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434605 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.434591005 +0000 UTC m=+109.456892408 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.535233 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.535436 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.535404964 +0000 UTC m=+109.557706337 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.637233 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.637458 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.637559 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.637535325 +0000 UTC m=+109.659836698 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.697531 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.697599 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.697668 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.697780 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.697849 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.697940 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.698004 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.698057 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:31 crc kubenswrapper[5116]: I0322 00:10:31.128795 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="00794bbb5aae6e05bd0e50af39e0b4d4e26deff3b187d7676d95950bdffb8dd8" exitCode=0
Mar 22 00:10:31 crc kubenswrapper[5116]: I0322 00:10:31.129457 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"00794bbb5aae6e05bd0e50af39e0b4d4e26deff3b187d7676d95950bdffb8dd8"}
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.134541 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="7c538fd9b69d6bb7a46afcbfeec835f156752a4d080e583406021eb7d30194a7" exitCode=0
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.134604 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"7c538fd9b69d6bb7a46afcbfeec835f156752a4d080e583406021eb7d30194a7"}
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.244283 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wlq8c"]
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.244480 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.244622 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.696879 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.696925 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.697308 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.697390 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.696963 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.697473 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:33 crc kubenswrapper[5116]: I0322 00:10:33.141276 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"42f55028ebe0714ec8dc6e34ea7f6aaa1c102bd83b7f126abea22b6b5c324427"}
Mar 22 00:10:33 crc kubenswrapper[5116]: I0322 00:10:33.697195 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:33 crc kubenswrapper[5116]: E0322 00:10:33.697387 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:34 crc kubenswrapper[5116]: I0322 00:10:34.696803 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:34 crc kubenswrapper[5116]: I0322 00:10:34.696813 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:34 crc kubenswrapper[5116]: E0322 00:10:34.696954 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:34 crc kubenswrapper[5116]: E0322 00:10:34.697084 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:34 crc kubenswrapper[5116]: I0322 00:10:34.697142 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:34 crc kubenswrapper[5116]: E0322 00:10:34.697255 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:35 crc kubenswrapper[5116]: I0322 00:10:35.696807 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:35 crc kubenswrapper[5116]: E0322 00:10:35.696956 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:36 crc kubenswrapper[5116]: I0322 00:10:36.696713 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:36 crc kubenswrapper[5116]: E0322 00:10:36.697143 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:36 crc kubenswrapper[5116]: I0322 00:10:36.696778 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:36 crc kubenswrapper[5116]: I0322 00:10:36.696746 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:36 crc kubenswrapper[5116]: E0322 00:10:36.697270 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:36 crc kubenswrapper[5116]: E0322 00:10:36.697453 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.534690 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeReady"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.534818 5116 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.569504 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bk75f" podStartSLOduration=87.569483485 podStartE2EDuration="1m27.569483485s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:33.163539568 +0000 UTC m=+104.185840941" watchObservedRunningTime="2026-03-22 00:10:37.569483485 +0000 UTC m=+108.591784858"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.570133 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-m5dds"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.577540 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.577780 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.579775 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.580091 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.583510 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.584439 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.584879 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.585045 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587018 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587402 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587722 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587962 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.588296 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.588976 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-w2nq2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.590427 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.593382 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.593723 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.594282 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.597308 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29568960-tjk88"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.598393 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.598427 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.600432 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.603369 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-747b44746d-cb5p2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.606078 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.607152 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-cb5p2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.607757 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.611346 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.612329 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.613664 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614658 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614731 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614765 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614917 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.615073 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.615212 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.623642 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.624612 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.624916 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625112 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625316 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625504 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625684 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625858 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.626023 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.626272 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-rs58m\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.626513 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.627085 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.627299 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.628042 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.628099 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.628544 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.629362 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.629606 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.629865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.634216 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.636942 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.641833 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.643088 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.648188 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.648331 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.648531 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.649272 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.649549 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.649925 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.650961 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.651203 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.651474 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.651683 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.653685 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.653700 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.655090 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-67c89758df-vnd4f"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.661256 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.662435 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-wb6r8"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.662547 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663416 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663796 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663852 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663992 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664000 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664227 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664578 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664980 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.666689 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.666875 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667003 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667199 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667257 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667501 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670236 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670400 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670548 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670894 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671142 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671281 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671363 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671594 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671753 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671937 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.672066 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.672227 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.672359 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.677296 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d44f6ddf-9g5sg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.678301 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.678397 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.680120 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.682756 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.687614 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.688076 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.688317 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.689710 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.703037 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.707505 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"
Mar 22 00:10:37 crc kubenswrapper[5116]: E0322 00:10:37.708747 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.730246 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.731602 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.731736 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.738623 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.738784 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.740489 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750327 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750441 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750516 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750343 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750549 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750625 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755245 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755453 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755595 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755709 5116 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755756 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755997 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a442ae21-7eff-4990-998f-27afcb839a6c-config\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756030 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrfl\" (UniqueName: \"kubernetes.io/projected/a442ae21-7eff-4990-998f-27afcb839a6c-kube-api-access-hbrfl\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756037 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756234 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756279 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756534 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756914 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756936 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757431 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9648k\" (UniqueName: \"kubernetes.io/projected/bac29c51-c815-4827-bd3d-c74f2e31f842-kube-api-access-9648k\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757462 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757497 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757525 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7xbb\" 
(UniqueName: \"kubernetes.io/projected/9884d9ba-fbeb-40db-8105-de302262478b-kube-api-access-c7xbb\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757575 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pngr\" (UniqueName: \"kubernetes.io/projected/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-kube-api-access-4pngr\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757646 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757674 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-auth-proxy-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757718 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " 
pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757772 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757846 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757869 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757882 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757912 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-serving-cert\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 
00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757933 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-config\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758001 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq7tp\" (UniqueName: \"kubernetes.io/projected/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-kube-api-access-tq7tp\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758068 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-client\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758108 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrpct\" (UniqueName: \"kubernetes.io/projected/45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0-kube-api-access-xrpct\") pod \"downloads-747b44746d-cb5p2\" (UID: \"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0\") " pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-encryption-config\") pod 
\"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758194 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758220 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-images\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758241 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-policies\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758280 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-dir\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758362 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqwb\" 
(UniqueName: \"kubernetes.io/projected/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-kube-api-access-bzqwb\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758412 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9884d9ba-fbeb-40db-8105-de302262478b-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758477 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a442ae21-7eff-4990-998f-27afcb839a6c-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758495 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758509 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " 
pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758551 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758566 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-image-import-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758615 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit-dir\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758648 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758675 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvntc\" (UniqueName: 
\"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758712 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-config\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758744 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-node-pullsecrets\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758762 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758778 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bac29c51-c815-4827-bd3d-c74f2e31f842-serving-cert\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: 
I0322 00:10:37.758830 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758886 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758930 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-encryption-config\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758957 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759027 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-machine-approver-tls\") 
pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759081 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-serving-cert\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759098 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759114 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-serving-ca\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759158 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759202 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-client\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759224 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.765025 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.768207 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.768763 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-42tp2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.771491 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.777657 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.777792 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68cf44c8b8-2jlxw"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781577 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781631 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781827 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781887 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.783036 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.790647 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.790813 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.795004 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.795270 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.797104 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.797469 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.798998 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.800694 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.800983 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.803912 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-bkst6"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.804367 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.809921 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.810000 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.816897 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.817013 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.822744 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.822926 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.828570 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-cb5p2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.828601 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.828732 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.833077 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.833380 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835839 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-wb6r8"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835872 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835886 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835900 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.836384 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.839211 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.839496 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.854694 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.855053 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858293 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858324 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858337 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858349 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-m5dds"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858362 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858376 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-74545575db-z8df4"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858510 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.859911 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.859941 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-image-import-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860069 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit-dir\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860102 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860226 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit-dir\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: 
\"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860238 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvntc\" (UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860480 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-config\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860496 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-node-pullsecrets\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860530 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860549 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bac29c51-c815-4827-bd3d-c74f2e31f842-serving-cert\") pod 
\"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860568 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860586 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860603 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-encryption-config\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860618 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860614 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-node-pullsecrets\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861117 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861157 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-image-import-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861534 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29568960-tjk88"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861560 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-9g5sg"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861577 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-42tp2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861591 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861604 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861618 5116 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861661 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-config\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861666 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861801 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861854 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-machine-approver-tls\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861971 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-serving-cert\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861990 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862008 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862064 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-serving-ca\") pod \"apiserver-8596bd845d-f59q2\" (UID: 
\"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862082 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862104 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-client\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862119 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862147 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a442ae21-7eff-4990-998f-27afcb839a6c-config\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862225 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod 
\"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862244 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrfl\" (UniqueName: \"kubernetes.io/projected/a442ae21-7eff-4990-998f-27afcb839a6c-kube-api-access-hbrfl\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862679 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862771 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.863807 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.863869 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-serving-ca\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864432 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864480 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864521 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9648k\" (UniqueName: \"kubernetes.io/projected/bac29c51-c815-4827-bd3d-c74f2e31f842-kube-api-access-9648k\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864549 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864574 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864590 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7xbb\" (UniqueName: \"kubernetes.io/projected/9884d9ba-fbeb-40db-8105-de302262478b-kube-api-access-c7xbb\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864608 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pngr\" (UniqueName: \"kubernetes.io/projected/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-kube-api-access-4pngr\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864631 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864648 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-auth-proxy-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " 
pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864664 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864682 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864705 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864721 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864740 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-serving-cert\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864756 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-config\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864773 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq7tp\" (UniqueName: \"kubernetes.io/projected/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-kube-api-access-tq7tp\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864802 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-client\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864829 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrpct\" (UniqueName: \"kubernetes.io/projected/45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0-kube-api-access-xrpct\") pod \"downloads-747b44746d-cb5p2\" (UID: \"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0\") " pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864855 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-encryption-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864876 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864893 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-images\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864909 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-policies\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864936 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-dir\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864955 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bzqwb\" (UniqueName: \"kubernetes.io/projected/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-kube-api-access-bzqwb\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864982 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9884d9ba-fbeb-40db-8105-de302262478b-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865013 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a442ae21-7eff-4990-998f-27afcb839a6c-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865029 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865044 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 
crc kubenswrapper[5116]: I0322 00:10:37.865138 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a442ae21-7eff-4990-998f-27afcb839a6c-config\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865458 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865866 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-dir\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.866531 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-auth-proxy-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867158 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867158 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bac29c51-c815-4827-bd3d-c74f2e31f842-serving-cert\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867197 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867328 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-config\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867602 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-policies\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867804 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.868009 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-encryption-config\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.868324 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.868755 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.870751 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871134 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-images\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871546 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-serving-cert\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871749 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-serving-cert\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871755 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-encryption-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872139 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872538 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872840 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-machine-approver-tls\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872942 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a442ae21-7eff-4990-998f-27afcb839a6c-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.873016 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-client\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.873201 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-client\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.873682 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.875050 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9884d9ba-fbeb-40db-8105-de302262478b-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.875327 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.876545 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.876585 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.876692 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.880800 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.896657 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.896795 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.899819 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902807 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902832 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902845 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902855 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902866 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-w2nq2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902883 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-r75lk"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902980 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.909781 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-55dml"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.909983 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-r75lk"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.913432 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.913561 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-55dml"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.917159 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mzck5"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.917366 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.919039 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922329 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922359 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922370 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-vnd4f"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922379 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922450 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929921 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-z8df4"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929947 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929957 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-bkst6"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929968 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rkswl"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.930065 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934802 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934868 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934885 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934898 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934915 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934928 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934940 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934941 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934952 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934967 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mzck5"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934983 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rkswl"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934993 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.935011 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-55dml"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.935023 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.939584 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.959054 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.980377 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.999703 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.019136 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.039310 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.059241 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.080473 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.099622 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.119897 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.139688 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.159386 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.179990 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.200465 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.219580 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.240599 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.267149 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.279084 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.299927 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.319964 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.340253 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.360590 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.379809 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.400602 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.420391 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.439480 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.459462 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474089 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474151 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474260 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474267 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474406 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474376777 +0000 UTC m=+125.496678150 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474448 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474480 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474583 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474563583 +0000 UTC m=+125.496864956 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474486 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474627 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474660 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474653996 +0000 UTC m=+125.496955389 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474466 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474690 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474704 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474739 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474733109 +0000 UTC m=+125.497034482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.480355 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.499570 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.519556 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.539310 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.559122 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.575714 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.576097 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.576059824 +0000 UTC m=+125.598361247 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.579374 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.600556 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.619699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.640344 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.659699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.677723 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.679294 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.683189 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.696568 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.696739 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.696768 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.699940 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.720362 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.740565 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.779555 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.792072 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.800699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.819660 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.837669 5116 request.go:752] "Waited before sending request" delay="1.008693167s" reason="client-side throttling, not priority and fairness" verb="GET" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0"
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.839493 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.858801 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.880253 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.900676 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.919672 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\""
Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.940357 
5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-client\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.960597 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.981017 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.999668 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.019383 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.039283 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.059326 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.080486 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.120138 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.139458 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.160525 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.179765 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.183991 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.184686 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.684662566 +0000 UTC m=+110.706963979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185189 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413bd8dc-5257-4fd7-95c1-01f6d79278ee-config\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185230 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/413bd8dc-5257-4fd7-95c1-01f6d79278ee-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185255 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/720e1d69-81b3-4fdb-94c1-dabb0707c833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185281 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185309 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185332 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-trusted-ca-bundle\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185459 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-trusted-ca\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185537 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: 
\"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185681 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-service-ca\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185728 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185862 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xkpc6\" (UniqueName: \"kubernetes.io/projected/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-kube-api-access-xkpc6\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185909 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv46d\" (UniqueName: \"kubernetes.io/projected/413bd8dc-5257-4fd7-95c1-01f6d79278ee-kube-api-access-tv46d\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185957 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.186002 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4117709-89bd-4e72-8016-0c25c0ece2c6-serving-cert\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.186046 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95r42\" (UniqueName: \"kubernetes.io/projected/b4117709-89bd-4e72-8016-0c25c0ece2c6-kube-api-access-95r42\") pod 
\"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.186091 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.187614 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.187692 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188317 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/720e1d69-81b3-4fdb-94c1-dabb0707c833-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188395 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtlfq\" (UniqueName: \"kubernetes.io/projected/3bf5ae18-6e08-436b-939f-03347eda68a8-kube-api-access-wtlfq\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188430 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc8c\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-kube-api-access-zqc8c\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188503 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188596 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188685 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413bd8dc-5257-4fd7-95c1-01f6d79278ee-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188770 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-available-featuregates\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188846 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188894 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188937 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-tmp\") pod 
\"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188966 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189093 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189123 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bf5ae18-6e08-436b-939f-03347eda68a8-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189152 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsw9z\" (UniqueName: \"kubernetes.io/projected/4c2755ce-817d-47b0-9f19-7218641d0c5b-kube-api-access-zsw9z\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 
00:10:39.189211 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189247 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-oauth-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189276 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189306 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189343 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-serving-cert\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189371 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189402 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189435 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: 
\"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189505 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189533 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-config\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189569 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-oauth-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189606 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.216991 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvntc\" 
(UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.219739 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.226880 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wlq8c"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.241732 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.259994 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290596 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.290815 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.790786273 +0000 UTC m=+110.813087646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290900 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290940 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-csi-data-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290961 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-srv-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290987 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291080 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-config\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291244 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291276 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-node-bootstrap-token\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291294 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56318568-cab8-4d5b-9a20-4531fc8aad60-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291311 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgc4\" (UniqueName: \"kubernetes.io/projected/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-kube-api-access-psgc4\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291328 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2512f5ef-a611-4637-b41f-41185def421b-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413bd8dc-5257-4fd7-95c1-01f6d79278ee-config\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291534 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/413bd8dc-5257-4fd7-95c1-01f6d79278ee-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291596 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56318568-cab8-4d5b-9a20-4531fc8aad60-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291651 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prql2\" (UniqueName: \"kubernetes.io/projected/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-kube-api-access-prql2\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.291688 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.791666811 +0000 UTC m=+110.813968344 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291759 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/720e1d69-81b3-4fdb-94c1-dabb0707c833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291790 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291822 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291845 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-trusted-ca\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291866 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291888 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291914 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56318568-cab8-4d5b-9a20-4531fc8aad60-config\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291939 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291963 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-srv-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291981 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292006 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292038 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkpc6\" (UniqueName: \"kubernetes.io/projected/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-kube-api-access-xkpc6\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292055 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292071 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413bd8dc-5257-4fd7-95c1-01f6d79278ee-config\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292070 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-profile-collector-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292097 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvwk\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-kube-api-access-ttvwk\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292114 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbsf\" (UniqueName: \"kubernetes.io/projected/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-kube-api-access-fkbsf\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292130 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lck5k\" (UniqueName: \"kubernetes.io/projected/a1258288-8146-4cba-9d66-2a88e35a1fe9-kube-api-access-lck5k\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292160 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/413bd8dc-5257-4fd7-95c1-01f6d79278ee-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292324 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65b5ebf3-054c-4827-96b2-7ea0a26f20af-tmp-dir\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292389 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292427 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6g2r\" (UniqueName: \"kubernetes.io/projected/73928a0d-7a97-4c03-a5e8-6ab37119261c-kube-api-access-p6g2r\") pod \"migrator-866fcbc849-m5rc6\" (UID: \"73928a0d-7a97-4c03-a5e8-6ab37119261c\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292457 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-serving-cert\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292495 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292535 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/720e1d69-81b3-4fdb-94c1-dabb0707c833-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292571 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgch\" (UniqueName: \"kubernetes.io/projected/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-kube-api-access-5xgch\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292603 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7b12ebb-b568-4d15-abde-14db5041d5d2-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292643 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292679 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292716 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292755 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413bd8dc-5257-4fd7-95c1-01f6d79278ee-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292785 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-serving-cert\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292816 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292852 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292884 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-cabundle\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292916 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-webhook-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292990 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-tmp\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293017 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293025 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e67bda-2839-4364-9a75-54864090dc1f-config\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293058 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-images\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293090 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293205 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293635 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293687 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-stats-auth\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf866f04-6739-40da-8c1c-36d192472220-tmp-dir\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293748 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2512f5ef-a611-4637-b41f-41185def421b-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293769 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-client\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293957 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294391 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-trusted-ca\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294442 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56318568-cab8-4d5b-9a20-4531fc8aad60-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294518 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294803 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294889 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsw9z\" (UniqueName: \"kubernetes.io/projected/4c2755ce-817d-47b0-9f19-7218641d0c5b-kube-api-access-zsw9z\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295045 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-plugins-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295081 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eea7869-af21-4009-856f-65219d64ceea-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295213 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295244 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295264 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295309 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-certs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295362 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295397 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2512f5ef-a611-4637-b41f-41185def421b-config\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295515 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-socket-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295549 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fhd\" (UniqueName: \"kubernetes.io/projected/1bb9e03f-ef85-4dbe-802f-d529e97b092c-kube-api-access-h4fhd\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295618 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295776 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295878 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-config\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296196 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf866f04-6739-40da-8c1c-36d192472220-kube-api-access\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296277 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296329 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg9xh\" (UniqueName: \"kubernetes.io/projected/65b5ebf3-054c-4827-96b2-7ea0a26f20af-kube-api-access-sg9xh\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296380 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-config\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296442 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-oauth-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296492 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296543 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296594 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-tmp-dir\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296651 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296701 5116 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e67bda-2839-4364-9a75-54864090dc1f-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296744 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-config\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296755 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcvxk\" (UniqueName: \"kubernetes.io/projected/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-kube-api-access-mcvxk\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296645 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296909 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: 
\"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296973 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd443947-7241-49e3-9d98-f55329818dcc-webhook-certs\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297029 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-trusted-ca-bundle\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297077 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2512f5ef-a611-4637-b41f-41185def421b-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297122 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-key\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297222 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65b5ebf3-054c-4827-96b2-7ea0a26f20af-metrics-tls\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297339 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-service-ca\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297420 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4117709-89bd-4e72-8016-0c25c0ece2c6-serving-cert\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297476 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95r42\" (UniqueName: \"kubernetes.io/projected/b4117709-89bd-4e72-8016-0c25c0ece2c6-kube-api-access-95r42\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297525 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297605 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rjsf\" (UniqueName: \"kubernetes.io/projected/93e67bda-2839-4364-9a75-54864090dc1f-kube-api-access-4rjsf\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297669 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297747 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297812 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a1258288-8146-4cba-9d66-2a88e35a1fe9-tmpfs\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297895 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-service-ca\") pod 
\"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297951 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv46d\" (UniqueName: \"kubernetes.io/projected/413bd8dc-5257-4fd7-95c1-01f6d79278ee-kube-api-access-tv46d\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297990 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298009 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-mountpoint-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298067 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00dffd10-d567-431f-8dd9-390443f26d96-service-ca-bundle\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298119 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xp2b\" (UniqueName: \"kubernetes.io/projected/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-kube-api-access-6xp2b\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298148 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqx7\" (UniqueName: \"kubernetes.io/projected/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-kube-api-access-9vqx7\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298189 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-tmp-dir\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298225 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298247 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzqjt\" (UniqueName: \"kubernetes.io/projected/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-kube-api-access-gzqjt\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298274 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-default-certificate\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298299 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtlfq\" (UniqueName: \"kubernetes.io/projected/3bf5ae18-6e08-436b-939f-03347eda68a8-kube-api-access-wtlfq\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298323 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc8c\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-kube-api-access-zqc8c\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298345 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-metrics-certs\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298371 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/bd443947-7241-49e3-9d98-f55329818dcc-kube-api-access-52j77\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298407 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298431 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298450 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-cert\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298472 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwsm\" (UniqueName: \"kubernetes.io/projected/0eea7869-af21-4009-856f-65219d64ceea-kube-api-access-rcwsm\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 
00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298495 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-available-featuregates\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298524 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b12ebb-b568-4d15-abde-14db5041d5d2-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298544 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-metrics-tls\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298567 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298601 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmdw\" (UniqueName: \"kubernetes.io/projected/00dffd10-d567-431f-8dd9-390443f26d96-kube-api-access-8bmdw\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: 
\"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298619 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298646 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf2fs\" (UniqueName: \"kubernetes.io/projected/dc9230ca-3a51-4ee3-976c-38c27605db87-kube-api-access-lf2fs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298668 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf866f04-6739-40da-8c1c-36d192472220-serving-cert\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298718 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-config-volume\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298746 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0eea7869-af21-4009-856f-65219d64ceea-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298803 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxs7g\" (UniqueName: \"kubernetes.io/projected/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-kube-api-access-hxs7g\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298876 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-tmpfs\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298963 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-oauth-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299004 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-serving-cert\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" 
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299057 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299089 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299142 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-registration-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299203 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bf5ae18-6e08-436b-939f-03347eda68a8-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299235 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-ca-trust-extracted-pem\") pod 
\"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299293 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktdtc\" (UniqueName: \"kubernetes.io/projected/fe48c9c2-8783-475b-a961-d5a4110cb452-kube-api-access-ktdtc\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299320 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-apiservice-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299389 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fe48c9c2-8783-475b-a961-d5a4110cb452-tmpfs\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299439 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 
00:10:39.299487 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299540 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf866f04-6739-40da-8c1c-36d192472220-config\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299916 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.300443 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.300898 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-oauth-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.301405 5116 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-trusted-ca-bundle\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.301616 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.302715 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.303388 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-oauth-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.303887 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413bd8dc-5257-4fd7-95c1-01f6d79278ee-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.303903 5116 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.304596 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.304930 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305052 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305104 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-available-featuregates\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " 
pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305237 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-service-ca\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305511 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.306220 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.307367 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.307493 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") 
pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.308017 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4117709-89bd-4e72-8016-0c25c0ece2c6-serving-cert\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.308246 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bf5ae18-6e08-436b-939f-03347eda68a8-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.308591 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-serving-cert\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.309491 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.309958 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hbrfl\" (UniqueName: \"kubernetes.io/projected/a442ae21-7eff-4990-998f-27afcb839a6c-kube-api-access-hbrfl\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.312785 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.319745 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.320529 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-tmp\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.320597 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.320945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/720e1d69-81b3-4fdb-94c1-dabb0707c833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.321404 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/720e1d69-81b3-4fdb-94c1-dabb0707c833-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.357812 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqwb\" (UniqueName: \"kubernetes.io/projected/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-kube-api-access-bzqwb\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.375284 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrpct\" (UniqueName: \"kubernetes.io/projected/45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0-kube-api-access-xrpct\") pod \"downloads-747b44746d-cb5p2\" (UID: \"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0\") " pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401218 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401652 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a1258288-8146-4cba-9d66-2a88e35a1fe9-tmpfs\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: 
\"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401680 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-service-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401731 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-mountpoint-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401783 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00dffd10-d567-431f-8dd9-390443f26d96-service-ca-bundle\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401808 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xp2b\" (UniqueName: \"kubernetes.io/projected/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-kube-api-access-6xp2b\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402059 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqx7\" (UniqueName: 
\"kubernetes.io/projected/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-kube-api-access-9vqx7\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402088 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-tmp-dir\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzqjt\" (UniqueName: \"kubernetes.io/projected/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-kube-api-access-gzqjt\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402645 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-default-certificate\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402743 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-metrics-certs\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402778 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/bd443947-7241-49e3-9d98-f55329818dcc-kube-api-access-52j77\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402808 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402828 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-cert\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402852 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcwsm\" (UniqueName: \"kubernetes.io/projected/0eea7869-af21-4009-856f-65219d64ceea-kube-api-access-rcwsm\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402878 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b12ebb-b568-4d15-abde-14db5041d5d2-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc 
kubenswrapper[5116]: I0322 00:10:39.402897 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-metrics-tls\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402917 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402950 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmdw\" (UniqueName: \"kubernetes.io/projected/00dffd10-d567-431f-8dd9-390443f26d96-kube-api-access-8bmdw\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402973 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403005 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf2fs\" (UniqueName: \"kubernetes.io/projected/dc9230ca-3a51-4ee3-976c-38c27605db87-kube-api-access-lf2fs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " 
pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403031 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf866f04-6739-40da-8c1c-36d192472220-serving-cert\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403056 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-config-volume\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403077 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0eea7869-af21-4009-856f-65219d64ceea-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403101 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxs7g\" (UniqueName: \"kubernetes.io/projected/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-kube-api-access-hxs7g\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403134 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-tmpfs\") pod 
\"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403193 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-registration-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403228 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktdtc\" (UniqueName: \"kubernetes.io/projected/fe48c9c2-8783-475b-a961-d5a4110cb452-kube-api-access-ktdtc\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403251 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-apiservice-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403276 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fe48c9c2-8783-475b-a961-d5a4110cb452-tmpfs\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403301 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clsgn\" (UniqueName: 
\"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403325 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf866f04-6739-40da-8c1c-36d192472220-config\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403352 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-csi-data-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403375 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-srv-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403406 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403452 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-config\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403482 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-node-bootstrap-token\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403506 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56318568-cab8-4d5b-9a20-4531fc8aad60-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403529 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psgc4\" (UniqueName: \"kubernetes.io/projected/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-kube-api-access-psgc4\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403556 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2512f5ef-a611-4637-b41f-41185def421b-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403591 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56318568-cab8-4d5b-9a20-4531fc8aad60-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403615 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prql2\" (UniqueName: \"kubernetes.io/projected/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-kube-api-access-prql2\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403658 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56318568-cab8-4d5b-9a20-4531fc8aad60-config\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403682 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-srv-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403704 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403736 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403772 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-profile-collector-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403794 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvwk\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-kube-api-access-ttvwk\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403818 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbsf\" (UniqueName: \"kubernetes.io/projected/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-kube-api-access-fkbsf\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403841 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lck5k\" (UniqueName: \"kubernetes.io/projected/a1258288-8146-4cba-9d66-2a88e35a1fe9-kube-api-access-lck5k\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403865 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65b5ebf3-054c-4827-96b2-7ea0a26f20af-tmp-dir\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403892 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403918 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6g2r\" (UniqueName: \"kubernetes.io/projected/73928a0d-7a97-4c03-a5e8-6ab37119261c-kube-api-access-p6g2r\") pod \"migrator-866fcbc849-m5rc6\" (UID: \"73928a0d-7a97-4c03-a5e8-6ab37119261c\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403942 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-serving-cert\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " 
pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403977 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xgch\" (UniqueName: \"kubernetes.io/projected/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-kube-api-access-5xgch\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403999 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7b12ebb-b568-4d15-abde-14db5041d5d2-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404028 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404058 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404088 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-serving-cert\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404123 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404151 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-cabundle\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404623 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-webhook-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.404694 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.904677277 +0000 UTC m=+110.926978640 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e67bda-2839-4364-9a75-54864090dc1f-config\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404814 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-images\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404959 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404980 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/a1258288-8146-4cba-9d66-2a88e35a1fe9-tmpfs\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404984 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-stats-auth\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405058 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf866f04-6739-40da-8c1c-36d192472220-tmp-dir\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405077 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2512f5ef-a611-4637-b41f-41185def421b-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405096 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-client\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405134 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56318568-cab8-4d5b-9a20-4531fc8aad60-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405188 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-plugins-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405206 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eea7869-af21-4009-856f-65219d64ceea-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405225 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405243 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-certs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc 
kubenswrapper[5116]: I0322 00:10:39.405266 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2512f5ef-a611-4637-b41f-41185def421b-config\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405286 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-socket-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405311 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fhd\" (UniqueName: \"kubernetes.io/projected/1bb9e03f-ef85-4dbe-802f-d529e97b092c-kube-api-access-h4fhd\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405336 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405363 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf866f04-6739-40da-8c1c-36d192472220-kube-api-access\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: 
\"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405410 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg9xh\" (UniqueName: \"kubernetes.io/projected/65b5ebf3-054c-4827-96b2-7ea0a26f20af-kube-api-access-sg9xh\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405427 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-config\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405476 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405502 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-tmp-dir\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405527 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e67bda-2839-4364-9a75-54864090dc1f-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405585 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcvxk\" (UniqueName: \"kubernetes.io/projected/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-kube-api-access-mcvxk\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405836 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " 
pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405863 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd443947-7241-49e3-9d98-f55329818dcc-webhook-certs\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405882 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2512f5ef-a611-4637-b41f-41185def421b-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405898 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-key\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405921 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65b5ebf3-054c-4827-96b2-7ea0a26f20af-metrics-tls\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405972 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rjsf\" (UniqueName: 
\"kubernetes.io/projected/93e67bda-2839-4364-9a75-54864090dc1f-kube-api-access-4rjsf\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.406597 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56318568-cab8-4d5b-9a20-4531fc8aad60-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.407400 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0eea7869-af21-4009-856f-65219d64ceea-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.407586 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.408035 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-tmpfs\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc 
kubenswrapper[5116]: I0322 00:10:39.408139 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-config\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.408461 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-registration-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404626 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b12ebb-b568-4d15-abde-14db5041d5d2-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.409000 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-service-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.409082 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-mountpoint-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.409975 
5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq7tp\" (UniqueName: \"kubernetes.io/projected/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-kube-api-access-tq7tp\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410580 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-tmp-dir\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410810 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7b12ebb-b568-4d15-abde-14db5041d5d2-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410953 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e67bda-2839-4364-9a75-54864090dc1f-config\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410999 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf866f04-6739-40da-8c1c-36d192472220-serving-cert\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 
00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.411156 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-images\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.411805 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-cabundle\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.411951 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412016 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412347 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-socket-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 
22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412394 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65b5ebf3-054c-4827-96b2-7ea0a26f20af-tmp-dir\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412949 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413025 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fe48c9c2-8783-475b-a961-d5a4110cb452-tmpfs\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413255 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf866f04-6739-40da-8c1c-36d192472220-config\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413418 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2512f5ef-a611-4637-b41f-41185def421b-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" 
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414014 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-plugins-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-profile-collector-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413572 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-csi-data-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414410 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56318568-cab8-4d5b-9a20-4531fc8aad60-config\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414432 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf866f04-6739-40da-8c1c-36d192472220-tmp-dir\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414453 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2512f5ef-a611-4637-b41f-41185def421b-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414667 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00dffd10-d567-431f-8dd9-390443f26d96-service-ca-bundle\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414714 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-tmp-dir\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414957 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2512f5ef-a611-4637-b41f-41185def421b-config\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.415870 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-default-certificate\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.416436 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-stats-auth\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.417234 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.418001 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-client\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.418262 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e67bda-2839-4364-9a75-54864090dc1f-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.419830 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c7xbb\" (UniqueName: \"kubernetes.io/projected/9884d9ba-fbeb-40db-8105-de302262478b-kube-api-access-c7xbb\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.419920 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56318568-cab8-4d5b-9a20-4531fc8aad60-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420236 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420710 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-serving-cert\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420814 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-srv-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420837 
5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eea7869-af21-4009-856f-65219d64ceea-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.421361 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-apiservice-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.421551 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-metrics-certs\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422109 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422213 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " 
pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422304 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-srv-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422513 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd443947-7241-49e3-9d98-f55329818dcc-webhook-certs\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422964 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65b5ebf3-054c-4827-96b2-7ea0a26f20af-metrics-tls\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.423013 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-webhook-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.423638 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-package-server-manager-serving-cert\") pod 
\"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.428596 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.433875 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-key\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.472187 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.481368 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9648k\" (UniqueName: \"kubernetes.io/projected/bac29c51-c815-4827-bd3d-c74f2e31f842-kube-api-access-9648k\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.484316 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pngr\" (UniqueName: \"kubernetes.io/projected/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-kube-api-access-4pngr\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.484745 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.485419 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.488271 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.496068 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda442ae21_7eff_4990_998f_27afcb839a6c.slice/crio-41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34 WatchSource:0}: Error finding container 41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34: Status 404 returned error can't find the container with id 41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34 Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.496836 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.499416 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.507631 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.507979 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.007966374 +0000 UTC m=+111.030267747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.512634 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.519218 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.519331 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.539956 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.554712 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.561786 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.563749 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-config\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.566824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-serving-cert\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.579933 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.585531 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.600620 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.606515 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.608304 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.608900 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.108884717 +0000 UTC m=+111.131186090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.621338 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.622769 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.640476 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.669227 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.673073 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-m5dds"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.682705 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.683204 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.688665 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.699963 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.705617 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.710598 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.710962 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.210947576 +0000 UTC m=+111.233248949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.722400 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.735275 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-w2nq2"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.744884 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\"" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.756213 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9884d9ba_fbeb_40db_8105_de302262478b.slice/crio-0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8 WatchSource:0}: Error finding container 0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8: Status 404 returned error can't find the container with id 0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8 Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.758413 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.759143 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.772878 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-node-bootstrap-token\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.782976 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.787909 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-cb5p2"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.796737 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-certs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.803575 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.811930 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod 
\"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.812876 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.312855849 +0000 UTC m=+111.335157222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.827415 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.838072 5116 request.go:752] "Waited before sending request" delay="1.92430841s" reason="client-side throttling, not priority and fairness" verb="GET" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.839205 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-cert\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.841761 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.875652 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.879457 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.899560 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.902854 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.903645 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.910670 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbac29c51_c815_4827_bd3d_c74f2e31f842.slice/crio-d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b WatchSource:0}: Error finding container d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b: Status 404 returned error can't find the container with id d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.914384 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.914679 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.41466891 +0000 UTC m=+111.436970283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.918183 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.923614 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.938989 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.947053 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29568960-tjk88"] Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.956535 5116 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f51f3b4_6887_42b5_ad77_5a2f349a162a.slice/crio-b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56 WatchSource:0}: Error finding container b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56: Status 404 returned error can't find the container with id b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56 Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.958922 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.968897 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b3ade2_2521_43a2_a5fc_2c33d19f3a58.slice/crio-82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c WatchSource:0}: Error finding container 82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c: Status 404 returned error can't find the container with id 82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.979208 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-sysctl-allowlist\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.986042 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.000532 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 22 00:10:40 
crc kubenswrapper[5116]: I0322 00:10:40.007210 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-config-volume\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.015924 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.016085 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.516066928 +0000 UTC m=+111.538368301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.016306 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.016718 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.516710347 +0000 UTC m=+111.539011720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.019718 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\"" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.021068 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"] Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.039870 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.048218 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-metrics-tls\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.062924 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.079710 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.099253 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Mar 22 00:10:40 crc 
kubenswrapper[5116]: I0322 00:10:40.117487 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.117876 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.617853887 +0000 UTC m=+111.640155260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.120008 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.174254 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerStarted","Data":"b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.176002 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29568960-tjk88" 
event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerStarted","Data":"82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.178039 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.178120 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-cb5p2" event={"ID":"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0","Type":"ContainerStarted","Data":"8d50655ab10e679cee022ffa16220973ee55f67aaf4152a787d4b6f4d6b3136d"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.178158 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-cb5p2" event={"ID":"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0","Type":"ContainerStarted","Data":"fdd8b565275a73869aba0dace482f6ad395d7c9d13ffd827c4dfd53f39767e96"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.179699 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerStarted","Data":"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.179732 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerStarted","Data":"4a8a88fe9fa050abb0479c637d1e4e232ab389aa2a939c4d9f3135fe99408731"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.182566 5116 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" event={"ID":"a442ae21-7eff-4990-998f-27afcb839a6c","Type":"ContainerStarted","Data":"7556a74f62cfa03d7d6877cf0ac7b3fcefce33d8a21efd30fa182f4e53aac22e"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.182620 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" event={"ID":"a442ae21-7eff-4990-998f-27afcb839a6c","Type":"ContainerStarted","Data":"41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.187880 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" event={"ID":"68cdf6e7-fccd-4375-9688-7a2bcbefd82f","Type":"ContainerStarted","Data":"c4605d831a0d6b573d011919b36267673e4cb38084bde38f6c891ba555f51a65"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.191895 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" event={"ID":"9884d9ba-fbeb-40db-8105-de302262478b","Type":"ContainerStarted","Data":"187e6738540da05e067ea423b98d4efd3d217d04e0f2ac6b5c6cc37914224616"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.191934 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" event={"ID":"9884d9ba-fbeb-40db-8105-de302262478b","Type":"ContainerStarted","Data":"0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.194244 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" event={"ID":"94c19a90-c2c9-4236-98be-a0516dbb840b","Type":"ContainerStarted","Data":"5da9c7d414944f4328a1119aa21269f4a0ac5f86587a539468ca57ad8f87b24f"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.194268 5116 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" event={"ID":"94c19a90-c2c9-4236-98be-a0516dbb840b","Type":"ContainerStarted","Data":"95b886b5476f7b77861e9a6b7b021397c333f535a09578ba08ef153f2832cd25"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.195430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" event={"ID":"bac29c51-c815-4827-bd3d-c74f2e31f842","Type":"ContainerStarted","Data":"d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.196912 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.197906 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" event={"ID":"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4","Type":"ContainerStarted","Data":"42db71085552419312922c644d5987206fd0d0ef12c21f3a93fedd2cdd98df1d"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.198003 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" event={"ID":"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4","Type":"ContainerStarted","Data":"287c0d5e6ba5b835ad3f5bd309bb9680d421b30fd9be8e8a568092d6a603afd7"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.201325 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" 
event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerStarted","Data":"701d96edd356dab1e9a8ae15bd01aef0cb8c6d8c29074e7d24c96c16a861b673"} Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.219288 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.219963 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.719941707 +0000 UTC m=+111.742243150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.219973 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkpc6\" (UniqueName: \"kubernetes.io/projected/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-kube-api-access-xkpc6\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.239945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zsw9z\" (UniqueName: \"kubernetes.io/projected/4c2755ce-817d-47b0-9f19-7218641d0c5b-kube-api-access-zsw9z\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.248448 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.255902 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.277207 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtlfq\" (UniqueName: \"kubernetes.io/projected/3bf5ae18-6e08-436b-939f-03347eda68a8-kube-api-access-wtlfq\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.301551 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc8c\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-kube-api-access-zqc8c\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.301918 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.307634 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.315197 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv46d\" (UniqueName: \"kubernetes.io/projected/413bd8dc-5257-4fd7-95c1-01f6d79278ee-kube-api-access-tv46d\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.321344 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.321737 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.821708005 +0000 UTC m=+111.844009378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.325890 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.326483 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.826469247 +0000 UTC m=+111.848770630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.335330 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95r42\" (UniqueName: \"kubernetes.io/projected/b4117709-89bd-4e72-8016-0c25c0ece2c6-kube-api-access-95r42\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.369130 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.377861 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktdtc\" (UniqueName: \"kubernetes.io/projected/fe48c9c2-8783-475b-a961-d5a4110cb452-kube-api-access-ktdtc\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.397364 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rjsf\" (UniqueName: \"kubernetes.io/projected/93e67bda-2839-4364-9a75-54864090dc1f-kube-api-access-4rjsf\") pod 
\"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.409431 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.422740 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.427307 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.427439 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.92742261 +0000 UTC m=+111.949723983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.427566 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.427907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.927900356 +0000 UTC m=+111.950201729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.440102 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xgch\" (UniqueName: \"kubernetes.io/projected/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-kube-api-access-5xgch\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.460649 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.465105 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxs7g\" (UniqueName: \"kubernetes.io/projected/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-kube-api-access-hxs7g\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.477435 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.499255 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgc4\" (UniqueName: \"kubernetes.io/projected/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-kube-api-access-psgc4\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.500837 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-wb6r8"] Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.502043 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56318568-cab8-4d5b-9a20-4531fc8aad60-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.508568 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.526620 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.528763 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.529187 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.529427 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.029410826 +0000 UTC m=+112.051712199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.536794 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.539116 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xp2b\" (UniqueName: \"kubernetes.io/projected/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-kube-api-access-6xp2b\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.552629 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"] Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.555601 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqx7\" (UniqueName: \"kubernetes.io/projected/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-kube-api-access-9vqx7\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.560411 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.560470 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.573652 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzqjt\" (UniqueName: \"kubernetes.io/projected/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-kube-api-access-gzqjt\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.592757 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"] Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.598199 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b1188c_0fa6_48c7_bf76_6e65ca8174ec.slice/crio-d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb WatchSource:0}: Error finding container d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb: Status 404 returned error can't find the container with id d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.598615 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lck5k\" (UniqueName: \"kubernetes.io/projected/a1258288-8146-4cba-9d66-2a88e35a1fe9-kube-api-access-lck5k\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.600302 5116 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720e1d69_81b3_4fdb_94c1_dabb0707c833.slice/crio-21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31 WatchSource:0}: Error finding container 21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31: Status 404 returned error can't find the container with id 21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31 Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.604385 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ebea9b_fc7b_4d54_af53_f6f61e0fce97.slice/crio-541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb WatchSource:0}: Error finding container 541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb: Status 404 returned error can't find the container with id 541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.615803 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvwk\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-kube-api-access-ttvwk\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.630965 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.631350 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.131334251 +0000 UTC m=+112.153635624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.633795 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbsf\" (UniqueName: \"kubernetes.io/projected/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-kube-api-access-fkbsf\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.639795 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.654460 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.662941 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.675798 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmdw\" (UniqueName: \"kubernetes.io/projected/00dffd10-d567-431f-8dd9-390443f26d96-kube-api-access-8bmdw\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.677355 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.693440 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d2a94f_b4d4_4cdc_b862_a4866cadaea1.slice/crio-03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d WatchSource:0}: Error finding container 03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d: Status 404 returned error can't find the container with id 03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.716895 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6g2r\" (UniqueName: \"kubernetes.io/projected/73928a0d-7a97-4c03-a5e8-6ab37119261c-kube-api-access-p6g2r\") pod \"migrator-866fcbc849-m5rc6\" (UID: \"73928a0d-7a97-4c03-a5e8-6ab37119261c\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.722194 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prql2\" (UniqueName: \"kubernetes.io/projected/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-kube-api-access-prql2\") pod \"dns-default-rkswl\" (UID: 
\"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.732744 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.733417 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.233395199 +0000 UTC m=+112.255696572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.734715 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.747315 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.747403 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg9xh\" (UniqueName: \"kubernetes.io/projected/65b5ebf3-054c-4827-96b2-7ea0a26f20af-kube-api-access-sg9xh\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.755253 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.763916 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.772078 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.781544 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcvxk\" (UniqueName: \"kubernetes.io/projected/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-kube-api-access-mcvxk\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.781797 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.791191 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.800437 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.802239 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf2fs\" (UniqueName: \"kubernetes.io/projected/dc9230ca-3a51-4ee3-976c-38c27605db87-kube-api-access-lf2fs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.820461 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.820544 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.828647 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.834272 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.834696 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.334677753 +0000 UTC m=+112.356979126 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.835095 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.835387 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf866f04-6739-40da-8c1c-36d192472220-kube-api-access\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.842929 5116 request.go:752] "Waited before sending request" delay="1.429325464s" reason="client-side throttling, not priority and fairness" verb="POST" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/serviceaccounts/multus-ac/token" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.845663 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fhd\" (UniqueName: \"kubernetes.io/projected/1bb9e03f-ef85-4dbe-802f-d529e97b092c-kube-api-access-h4fhd\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.852483 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.859695 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-9g5sg"] Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.867601 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/bd443947-7241-49e3-9d98-f55329818dcc-kube-api-access-52j77\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.872514 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.878510 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcwsm\" (UniqueName: \"kubernetes.io/projected/0eea7869-af21-4009-856f-65219d64ceea-kube-api-access-rcwsm\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.897754 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2512f5ef-a611-4637-b41f-41185def421b-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.925074 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.931386 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.935541 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.935850 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.435834073 +0000 UTC m=+112.458135446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.945693 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.955018 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.993342 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.023713 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.036605 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.036897 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.53688617 +0000 UTC m=+112.559187533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: W0322 00:10:41.113606 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00dffd10_d567_431f_8dd9_390443f26d96.slice/crio-5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f WatchSource:0}: Error finding container 5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f: Status 404 returned error can't find the container with id 5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.127007 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"] Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.137813 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.137884 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:41.637866644 +0000 UTC m=+112.660168027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.138196 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.138546 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.638528965 +0000 UTC m=+112.660830328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.207903 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerStarted","Data":"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.208200 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.208904 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-r75lk" event={"ID":"dc9230ca-3a51-4ee3-976c-38c27605db87","Type":"ContainerStarted","Data":"efffb0ad943198a07230309dd53c994c1411da589dbe123311cafbb9e5fa0172"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.210430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" event={"ID":"9884d9ba-fbeb-40db-8105-de302262478b","Type":"ContainerStarted","Data":"9bfbd0761ac53ac1129a6e95695fa17cb6457e56c7d321788eea8e02c8759ed4"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.213708 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" 
event={"ID":"94c19a90-c2c9-4236-98be-a0516dbb840b","Type":"ContainerStarted","Data":"94524cbda17a754b2c962f246991ddd1a68e24324f4f27eb2c33b803e8cab1ef"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.218227 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" event={"ID":"44b1188c-0fa6-48c7-bf76-6e65ca8174ec","Type":"ContainerStarted","Data":"d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.219658 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" event={"ID":"bac29c51-c815-4827-bd3d-c74f2e31f842","Type":"ContainerStarted","Data":"28d79fc4790c4f02ffe53e980b29154ae8315112a1d282830854ede66cb399e2"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.221818 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerStarted","Data":"03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.230425 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" event={"ID":"720e1d69-81b3-4fdb-94c1-dabb0707c833","Type":"ContainerStarted","Data":"21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.233325 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerStarted","Data":"541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.235229 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-pruner-29568960-tjk88" event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerStarted","Data":"1d4ffcdcf7f3c1ceaea89d564eaa99a17f05170fe4b5407ce1655d672a0e5a4e"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.237260 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" event={"ID":"00dffd10-d567-431f-8dd9-390443f26d96","Type":"ContainerStarted","Data":"5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.239027 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.239403 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.739361275 +0000 UTC m=+112.761662658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.240942 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.240947 5116 generic.go:358] "Generic (PLEG): container finished" podID="68cdf6e7-fccd-4375-9688-7a2bcbefd82f" containerID="38b12cab203858c4c797989bc1b13b2ea864137e8626e042e66c979fb3b26066" exitCode=0 Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.240976 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" event={"ID":"68cdf6e7-fccd-4375-9688-7a2bcbefd82f","Type":"ContainerDied","Data":"38b12cab203858c4c797989bc1b13b2ea864137e8626e042e66c979fb3b26066"} Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.241440 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.741363198 +0000 UTC m=+112.763664651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.244282 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-9g5sg" event={"ID":"4c2755ce-817d-47b0-9f19-7218641d0c5b","Type":"ContainerStarted","Data":"dbec98cf4e43e15ff41d5a49916dbfd9285ace92f1c8184b77f7389222a14b01"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.247054 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" event={"ID":"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4","Type":"ContainerStarted","Data":"89bebe35c6dbb2c0b52a587df591e5f99263e1302c9ca5663a1b8462c04e9fb6"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.250003 5116 generic.go:358] "Generic (PLEG): container finished" podID="8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0" containerID="aebe47f61ff1b229e280099427e2808f24b0724cc7faa10d582dbd873dfaedd3" exitCode=0 Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.250104 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerDied","Data":"aebe47f61ff1b229e280099427e2808f24b0724cc7faa10d582dbd873dfaedd3"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.278724 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.278924 5116 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.284881 5116 patch_prober.go:28] interesting pod/route-controller-manager-776cdc94d6-fw5k5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.284961 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.285404 5116 patch_prober.go:28] interesting pod/controller-manager-65b6cccf98-9kdkj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.285470 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.299670 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 
00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.299745 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.344301 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.346357 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.846336289 +0000 UTC m=+112.868637662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.410202 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" podStartSLOduration=91.410124514 podStartE2EDuration="1m31.410124514s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:41.351098181 +0000 UTC m=+112.373399554" watchObservedRunningTime="2026-03-22 00:10:41.410124514 +0000 UTC m=+112.432425897" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.446389 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.446794 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.946777027 +0000 UTC m=+112.969078400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.469044 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" podStartSLOduration=91.469030673 podStartE2EDuration="1m31.469030673s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:41.437936146 +0000 UTC m=+112.460237539" watchObservedRunningTime="2026-03-22 00:10:41.469030673 +0000 UTC m=+112.491332046" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.547415 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.548313 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.048291287 +0000 UTC m=+113.070592660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.653352 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.653683 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.153669812 +0000 UTC m=+113.175971185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.719138 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wlq8c" podStartSLOduration=91.719118928 podStartE2EDuration="1m31.719118928s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:41.715260986 +0000 UTC m=+112.737562379" watchObservedRunningTime="2026-03-22 00:10:41.719118928 +0000 UTC m=+112.741420321" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.754730 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.754919 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.254882453 +0000 UTC m=+113.277183826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.755287 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.755662 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.255653337 +0000 UTC m=+113.277954710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.857283 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.857875 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.357855731 +0000 UTC m=+113.380157114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.960323 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.961016 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.460998833 +0000 UTC m=+113.483300206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.061697 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.062194 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.562176924 +0000 UTC m=+113.584478297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.107607 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.118769 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.138119 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"] Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.147390 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe48c9c2_8783_475b_a961_d5a4110cb452.slice/crio-f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486 WatchSource:0}: Error finding container f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486: Status 404 returned error can't find the container with id f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.163962 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") 
" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.164710 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.664696327 +0000 UTC m=+113.686997700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.264947 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.265376 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.765353721 +0000 UTC m=+113.787655094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.275733 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-r75lk" event={"ID":"dc9230ca-3a51-4ee3-976c-38c27605db87","Type":"ContainerStarted","Data":"bd24c60ce4d85de35c70b0f8332cafa2c6cb13a90f7c511a6a9293a96a00baf1"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.282795 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" podStartSLOduration=93.282770213 podStartE2EDuration="1m33.282770213s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.274724078 +0000 UTC m=+113.297025461" watchObservedRunningTime="2026-03-22 00:10:42.282770213 +0000 UTC m=+113.305071596" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.311997 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" event={"ID":"720e1d69-81b3-4fdb-94c1-dabb0707c833","Type":"ContainerStarted","Data":"73a12da3b3bc7773cfec2bcaf6ea449dce19d7297c195ed650b9612c81288a8e"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.322538 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 
00:10:42.332081 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" event={"ID":"fe48c9c2-8783-475b-a961-d5a4110cb452","Type":"ContainerStarted","Data":"f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.336485 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.354882 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.363385 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.368025 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.368943 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.868930437 +0000 UTC m=+113.891231810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.378500 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" event={"ID":"413bd8dc-5257-4fd7-95c1-01f6d79278ee","Type":"ContainerStarted","Data":"723176f625a94af34283d0f388978d0c12b4c1bf21948c21dc6d8168fb69989f"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.379058 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rkswl"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.390380 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" event={"ID":"00dffd10-d567-431f-8dd9-390443f26d96","Type":"ContainerStarted","Data":"1760d0cb129bddc97f7c1c542eab1158bf68b95b799269e9acc9c36272c6d24e"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.405907 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" event={"ID":"93e67bda-2839-4364-9a75-54864090dc1f","Type":"ContainerStarted","Data":"c748392e8bdddae09e73c1aec7abac3ee2017ebac04d6f15c62d3c0ad4fc8ea2"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.429012 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-z8df4"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.437073 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-64d44f6ddf-9g5sg" event={"ID":"4c2755ce-817d-47b0-9f19-7218641d0c5b","Type":"ContainerStarted","Data":"d31637ad8e7ff373cab181373fc2ece79b732258d319b32c6a963d0b07a6c56a"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.450580 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-vnd4f"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.451230 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-55dml"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.466710 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.469488 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.470834 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.970816461 +0000 UTC m=+113.993117834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.489151 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4c0dd5_6c39_4661_b0c6_424d6b061f04.slice/crio-a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991 WatchSource:0}: Error finding container a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991: Status 404 returned error can't find the container with id a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.494068 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55450: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.499001 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.499143 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.511694 5116 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4117709_89bd_4e72_8016_0c25c0ece2c6.slice/crio-ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0 WatchSource:0}: Error finding container ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0: Status 404 returned error can't find the container with id ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.575878 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.576250 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.076233155 +0000 UTC m=+114.098534528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.610985 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55466: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.654905 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.658496 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" podStartSLOduration=92.658478925 podStartE2EDuration="1m32.658478925s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.657868776 +0000 UTC m=+113.680170149" watchObservedRunningTime="2026-03-22 00:10:42.658478925 +0000 UTC m=+113.680780298" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.673036 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.679727 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: 
\"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.680373 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.180353559 +0000 UTC m=+114.202654932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.689917 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.705641 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55468: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.723979 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-42tp2"] Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.728459 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1258288_8146_4cba_9d66_2a88e35a1fe9.slice/crio-9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755 WatchSource:0}: Error finding container 9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755: Status 404 returned error can't find the container with id 9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755 Mar 22 00:10:42 crc 
kubenswrapper[5116]: I0322 00:10:42.751938 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.777792 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.782388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.782743 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.282720657 +0000 UTC m=+114.305022030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.803457 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.805867 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mzck5"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.810366 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-bkst6"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.811547 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55474: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.812730 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.819067 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.845585 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.866971 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"] Mar 22 
00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.870767 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.874773 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d44f6ddf-9g5sg" podStartSLOduration=92.874752378 podStartE2EDuration="1m32.874752378s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.869235473 +0000 UTC m=+113.891536866" watchObservedRunningTime="2026-03-22 00:10:42.874752378 +0000 UTC m=+113.897053751" Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.880991 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2512f5ef_a611_4637_b41f_41185def421b.slice/crio-c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12 WatchSource:0}: Error finding container c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12: Status 404 returned error can't find the container with id c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.886081 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.886283 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:43.386235832 +0000 UTC m=+114.408537195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.886886 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.887509 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.387491862 +0000 UTC m=+114.409793235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.888726 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice/crio-dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9 WatchSource:0}: Error finding container dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9: Status 404 returned error can't find the container with id dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.910253 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55490: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.942350 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.942401 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.942982 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.988554 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.989380 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.489357395 +0000 UTC m=+114.511658768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.995085 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-747b44746d-cb5p2" podStartSLOduration=92.995067216 podStartE2EDuration="1m32.995067216s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.991514233 +0000 UTC m=+114.013815626" watchObservedRunningTime="2026-03-22 00:10:42.995067216 +0000 UTC m=+114.017368599" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.043966 5116 ???:1] "http: TLS handshake error 
from 192.168.126.11:55496: no serving certificate available for the kubelet" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.090725 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.091368 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.591356481 +0000 UTC m=+114.613657854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.131944 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" podStartSLOduration=93.131921209 podStartE2EDuration="1m33.131921209s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.128786429 +0000 UTC m=+114.151087802" watchObservedRunningTime="2026-03-22 00:10:43.131921209 +0000 UTC m=+114.154222592" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 
00:10:43.132317 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55502: no serving certificate available for the kubelet" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.136696 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29568960-tjk88" podStartSLOduration=94.136681279 podStartE2EDuration="1m34.136681279s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.056092582 +0000 UTC m=+114.078393955" watchObservedRunningTime="2026-03-22 00:10:43.136681279 +0000 UTC m=+114.158982652" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.149882 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" podStartSLOduration=93.149862857 podStartE2EDuration="1m33.149862857s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.147997358 +0000 UTC m=+114.170298741" watchObservedRunningTime="2026-03-22 00:10:43.149862857 +0000 UTC m=+114.172164230" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.166132 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" podStartSLOduration=95.166109353 podStartE2EDuration="1m35.166109353s" podCreationTimestamp="2026-03-22 00:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.163684366 +0000 UTC m=+114.185985739" watchObservedRunningTime="2026-03-22 00:10:43.166109353 +0000 UTC m=+114.188410726" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.181810 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.194917 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.195330 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.695309719 +0000 UTC m=+114.717611092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.197483 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podStartSLOduration=93.197465288 podStartE2EDuration="1m33.197465288s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.196476677 +0000 UTC m=+114.218778150" watchObservedRunningTime="2026-03-22 00:10:43.197465288 +0000 UTC m=+114.219766661" Mar 22 00:10:43 crc 
kubenswrapper[5116]: I0322 00:10:43.286544 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55510: no serving certificate available for the kubelet" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.299643 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.300434 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.800416834 +0000 UTC m=+114.822718207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.403474 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.403633 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.903591729 +0000 UTC m=+114.925893102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.404752 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.405210 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.90519777 +0000 UTC m=+114.927499143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.469426 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" event={"ID":"68cdf6e7-fccd-4375-9688-7a2bcbefd82f","Type":"ContainerStarted","Data":"aa58c1b4b605febfa81daefda67e613d9aa8f5b439bda030490238302b4fc814"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.476106 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" event={"ID":"a224637e-e693-4ae7-89c3-1a01e6c9a6f5","Type":"ContainerStarted","Data":"a8af25551d1890f12f0d47062adc0afaa25a755cba29680343f498bfb1020e02"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.489378 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerStarted","Data":"0ec1c1ec99b4ee4c79936b1b90ac85731d0a065a26aa6ac29f075ef6bdc7676a"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.491352 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" event={"ID":"cf866f04-6739-40da-8c1c-36d192472220","Type":"ContainerStarted","Data":"cfd4dc4ddc61db053d22358e0a3c01f31a90898bf77c1791f70ef6e6eb889660"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.493925 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" event={"ID":"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee","Type":"ContainerStarted","Data":"a3c9c27e9d2cc97bca372317fcf8d468530023234119b2a4fbe84defe8f015a8"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.496518 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" event={"ID":"0eea7869-af21-4009-856f-65219d64ceea","Type":"ContainerStarted","Data":"27260d6135cfacddcdfca7e2b5f9de8c645628459b662e97af029712da6761eb"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.498006 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" event={"ID":"2512f5ef-a611-4637-b41f-41185def421b","Type":"ContainerStarted","Data":"c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.501769 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" event={"ID":"73928a0d-7a97-4c03-a5e8-6ab37119261c","Type":"ContainerStarted","Data":"dc3baf11f173de740ebc9e5638baaa55491861bf7fa3f9563de211a3294d2d8b"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.505791 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.505961 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:44.005928276 +0000 UTC m=+115.028229649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.506348 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.506796 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.006757403 +0000 UTC m=+115.029058776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.507567 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" event={"ID":"3bf5ae18-6e08-436b-939f-03347eda68a8","Type":"ContainerStarted","Data":"441a1789203e5ff9b4371c60983659510d2ce4e47c16b032b945bd3ef8532888"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.507629 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" event={"ID":"3bf5ae18-6e08-436b-939f-03347eda68a8","Type":"ContainerStarted","Data":"cb910c1ce7004c90694e7cb0f97d872914ab3ce97eecec864fde592b9d28f229"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.512898 5116 generic.go:358] "Generic (PLEG): container finished" podID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerID="e7b62cb886ebeacc39c91f1f6acfa8bb840da76b7903d79d0c7c1dfa267d370b" exitCode=0 Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.512962 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" event={"ID":"44b1188c-0fa6-48c7-bf76-6e65ca8174ec","Type":"ContainerDied","Data":"e7b62cb886ebeacc39c91f1f6acfa8bb840da76b7903d79d0c7c1dfa267d370b"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.518566 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" 
event={"ID":"5df5e2ea-4661-49c7-95f2-d8c039bbea5d","Type":"ContainerStarted","Data":"35d6628d6171d1a0e5f1525c2fbbc4417fe95a5014f9ba89b2cbd6024dce0301"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.518608 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" event={"ID":"5df5e2ea-4661-49c7-95f2-d8c039bbea5d","Type":"ContainerStarted","Data":"60a9ef36afbb727be95c8dc1084b633e0aac362b1138282f574e1f604d2ec0b7"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.520315 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" event={"ID":"fe48c9c2-8783-475b-a961-d5a4110cb452","Type":"ContainerStarted","Data":"00325fd390abfcb658e5387701cd8c98e8285bb75c8b6feaf9931447deb8e216"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.521831 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" event={"ID":"bd443947-7241-49e3-9d98-f55329818dcc","Type":"ContainerStarted","Data":"edb740c8bef17d97c70456de5ee812ae7840e8ba280b1bce72e739c160de5438"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.523510 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" event={"ID":"56318568-cab8-4d5b-9a20-4531fc8aad60","Type":"ContainerStarted","Data":"d3d87ef59ddfee5645a922c56a1a308caadd0cb655dd046c6c70ba66ced1f4e5"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.523534 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" event={"ID":"56318568-cab8-4d5b-9a20-4531fc8aad60","Type":"ContainerStarted","Data":"d99035b44f8e9d301a6148212aba53b6f400724c4c4417749ae1e400595cc5ca"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.524990 5116 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" event={"ID":"0910a8a1-0226-42c8-ab1d-b142d2b7a00d","Type":"ContainerStarted","Data":"2a9593d15ec672301e7be36642cd55d448e399a45c5270574baa719c0272c69d"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.525340 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" event={"ID":"0910a8a1-0226-42c8-ab1d-b142d2b7a00d","Type":"ContainerStarted","Data":"744e83dbeca747713e5bc106c8c1657865723be5795575912f982ea9219abf0c"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.530213 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-z8df4" event={"ID":"bd4c0dd5-6c39-4661-b0c6-424d6b061f04","Type":"ContainerStarted","Data":"12c3c52a4b89ff3cdb4a058248706e793e77a03881ad1275cd96a1ef75655d21"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.530238 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-z8df4" event={"ID":"bd4c0dd5-6c39-4661-b0c6-424d6b061f04","Type":"ContainerStarted","Data":"a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.538444 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" event={"ID":"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4","Type":"ContainerStarted","Data":"18112439814edd26fc6f9feabf6054a0029c97046be0d71f72e725b224eebd6e"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.540129 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-55dml" event={"ID":"dcd47220-17b3-4593-9a12-fb67c0c0dcc8","Type":"ContainerStarted","Data":"48bc4f44ea89b3eb16e668295ea44a7a0368109fe874f3f4d2c54a972d3b88ac"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.540159 5116 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-55dml" event={"ID":"dcd47220-17b3-4593-9a12-fb67c0c0dcc8","Type":"ContainerStarted","Data":"73630d9731f678fb90ce32c55553adde10ec0fb4b03b37d708406c8acaaee087"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.543434 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" event={"ID":"c7b12ebb-b568-4d15-abde-14db5041d5d2","Type":"ContainerStarted","Data":"56c33e03ce6f00b602f67882ebe171360b1da7dcfb34b6c4f3bfb4e1912f9cd3"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.543463 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" event={"ID":"c7b12ebb-b568-4d15-abde-14db5041d5d2","Type":"ContainerStarted","Data":"a49956e14fb6aa75f2f9c124c9e4161965fbc0b1c86e79ab712054c63185767d"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.545283 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerStarted","Data":"a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.545310 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerStarted","Data":"082b4604427fbc7c5d9bce23172c03602291dda6ccea1696f1f624d1746d3739"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.546828 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" event={"ID":"fc0602bb-5acd-426a-b3d6-3a2effb49bf3","Type":"ContainerStarted","Data":"332dafd2662f48ec31bf3a0a888d939eecc32e94040c64906ee34183b65eb690"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.546854 
5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" event={"ID":"fc0602bb-5acd-426a-b3d6-3a2effb49bf3","Type":"ContainerStarted","Data":"57c5e3a242c4b7c6908c9b7bb0828fa534c40b70be31283e92b4fbd2c7096def"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.567799 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" event={"ID":"65b5ebf3-054c-4827-96b2-7ea0a26f20af","Type":"ContainerStarted","Data":"4d440f2dfcddc54fd556bfe43abeb04733e8a7617dca10952526a1edc3bc3c85"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.571278 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkswl" event={"ID":"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a","Type":"ContainerStarted","Data":"dd7d3ee59ad7adc7ee11f8b9aeb0f81d38fef91b3e02560f14a01b0f64ff8f1a"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.591267 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerStarted","Data":"a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.593845 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerStarted","Data":"0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.600935 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" event={"ID":"413bd8dc-5257-4fd7-95c1-01f6d79278ee","Type":"ContainerStarted","Data":"85f1aefdf5628b11ce519a901f58bba27ced3b6d1638a2fc598e624271f63382"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 
00:10:43.605940 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerStarted","Data":"dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.608668 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" event={"ID":"93e67bda-2839-4364-9a75-54864090dc1f","Type":"ContainerStarted","Data":"9f662781c4eb461bb13ce5c5759b95a3882a62d90eac7634ad6302e955e47a44"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.608886 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.609012 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.108992776 +0000 UTC m=+115.131294139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.609335 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.610049 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.110029529 +0000 UTC m=+115.132330902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.611471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"cd9d650c3258cebfd58378de1572e0475377631fb57a20b94d8ed65a47d615be"}
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.614902 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" event={"ID":"a1258288-8146-4cba-9d66-2a88e35a1fe9","Type":"ContainerStarted","Data":"0b5fbe52956196216d6fe41d0239bf23cd3ebb86c349dfe490f8266571e7bea5"}
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.615084 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" event={"ID":"a1258288-8146-4cba-9d66-2a88e35a1fe9","Type":"ContainerStarted","Data":"9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755"}
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.617865 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" event={"ID":"b4117709-89bd-4e72-8016-0c25c0ece2c6","Type":"ContainerStarted","Data":"db8eaf8089a615a1169629a8ceff9d51faff3f489ba7e50a66264a1bc3dd2bc6"}
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.617901 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" event={"ID":"b4117709-89bd-4e72-8016-0c25c0ece2c6","Type":"ContainerStarted","Data":"ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0"}
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.645478 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.655219 5116 patch_prober.go:28] interesting pod/oauth-openshift-66458b6674-8qfhd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" start-of-body=
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.655288 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.663639 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" podStartSLOduration=93.663594959 podStartE2EDuration="1m33.663594959s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.663432704 +0000 UTC m=+114.685734107" watchObservedRunningTime="2026-03-22 00:10:43.663594959 +0000 UTC m=+114.685896332"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.687357 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podStartSLOduration=93.687338782 podStartE2EDuration="1m33.687338782s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.681157746 +0000 UTC m=+114.703459139" watchObservedRunningTime="2026-03-22 00:10:43.687338782 +0000 UTC m=+114.709640155"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.706956 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" podStartSLOduration=93.706937874 podStartE2EDuration="1m33.706937874s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.702972959 +0000 UTC m=+114.725274342" watchObservedRunningTime="2026-03-22 00:10:43.706937874 +0000 UTC m=+114.729239247"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.712023 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.712533 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.212512011 +0000 UTC m=+115.234813384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.713408 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.713413 5116 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-cp4p2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.713471 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podUID="fe48c9c2-8783-475b-a961-d5a4110cb452" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.715135 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.215114343 +0000 UTC m=+115.237415716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.723054 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" podStartSLOduration=93.723035726 podStartE2EDuration="1m33.723035726s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.720972129 +0000 UTC m=+114.743273502" watchObservedRunningTime="2026-03-22 00:10:43.723035726 +0000 UTC m=+114.745337099"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.750634 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" podStartSLOduration=93.75061041 podStartE2EDuration="1m33.75061041s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.740857841 +0000 UTC m=+114.763159214" watchObservedRunningTime="2026-03-22 00:10:43.75061041 +0000 UTC m=+114.772911803"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.773687 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.783103 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podStartSLOduration=93.78308171 podStartE2EDuration="1m33.78308171s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.761874558 +0000 UTC m=+114.784175931" watchObservedRunningTime="2026-03-22 00:10:43.78308171 +0000 UTC m=+114.805383083"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.783233 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-r75lk" podStartSLOduration=6.783227195 podStartE2EDuration="6.783227195s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.780744916 +0000 UTC m=+114.803046289" watchObservedRunningTime="2026-03-22 00:10:43.783227195 +0000 UTC m=+114.805528598"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.800886 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" podStartSLOduration=93.800873215 podStartE2EDuration="1m33.800873215s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.800530564 +0000 UTC m=+114.822831947" watchObservedRunningTime="2026-03-22 00:10:43.800873215 +0000 UTC m=+114.823174588"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.823145 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.825312 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.32528385 +0000 UTC m=+115.347585233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.883083 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" podStartSLOduration=93.883065613 podStartE2EDuration="1m33.883065613s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.850223771 +0000 UTC m=+114.872525164" watchObservedRunningTime="2026-03-22 00:10:43.883065613 +0000 UTC m=+114.905366976"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.883191 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" podStartSLOduration=93.883185757 podStartE2EDuration="1m33.883185757s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.878123796 +0000 UTC m=+114.900425179" watchObservedRunningTime="2026-03-22 00:10:43.883185757 +0000 UTC m=+114.905487130"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.926805 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.927225 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.427209564 +0000 UTC m=+115.449510937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.935806 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.935888 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.953502 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podStartSLOduration=6.953483307 podStartE2EDuration="6.953483307s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.92172023 +0000 UTC m=+114.944021623" watchObservedRunningTime="2026-03-22 00:10:43.953483307 +0000 UTC m=+114.975784680"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.961884 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" podStartSLOduration=93.961863333 podStartE2EDuration="1m33.961863333s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.95955265 +0000 UTC m=+114.981854023" watchObservedRunningTime="2026-03-22 00:10:43.961863333 +0000 UTC m=+114.984164716"
Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.965760 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55512: no serving certificate available for the kubelet"
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.015859 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-74545575db-z8df4" podStartSLOduration=94.015844486 podStartE2EDuration="1m34.015844486s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.982902081 +0000 UTC m=+115.005203454" watchObservedRunningTime="2026-03-22 00:10:44.015844486 +0000 UTC m=+115.038145849"
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.017811 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podStartSLOduration=94.017802158 podStartE2EDuration="1m34.017802158s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:44.014441962 +0000 UTC m=+115.036743355" watchObservedRunningTime="2026-03-22 00:10:44.017802158 +0000 UTC m=+115.040103531"
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.029947 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.030377 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.530350787 +0000 UTC m=+115.552652160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.043501 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-55dml" podStartSLOduration=7.043480763 podStartE2EDuration="7.043480763s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:44.041974806 +0000 UTC m=+115.064276179" watchObservedRunningTime="2026-03-22 00:10:44.043480763 +0000 UTC m=+115.065782136"
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.133350 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.134035 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.634017266 +0000 UTC m=+115.656318639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.234857 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.234960 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.734936849 +0000 UTC m=+115.757238222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.235100 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.235500 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.735490866 +0000 UTC m=+115.757792239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.336472 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.336845 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.836828802 +0000 UTC m=+115.859130175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.438392 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.438907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.938893631 +0000 UTC m=+115.961195004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.539244 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.539827 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.039809593 +0000 UTC m=+116.062110956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.624596 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerStarted","Data":"cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.626118 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" event={"ID":"a224637e-e693-4ae7-89c3-1a01e6c9a6f5","Type":"ContainerStarted","Data":"e8d83bd6bf7d126be69f2a9272660c33e633ad82f3127df15394284067264add"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.628046 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerStarted","Data":"9b8ced341e1f46400da0f6c65b177eea973fdf22bf43e44e35a845682529abf7"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.628861 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" event={"ID":"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee","Type":"ContainerStarted","Data":"f7d437beaa74078aa84a893bb3ee4d62bba1cafb5c44878654f173f2c29422ec"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.629623 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" event={"ID":"0eea7869-af21-4009-856f-65219d64ceea","Type":"ContainerStarted","Data":"6d0104ada666e7e6c2f6c7d2cea8673dd2d3432041ed2f46677719d43d77ffbc"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.630397 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" event={"ID":"73928a0d-7a97-4c03-a5e8-6ab37119261c","Type":"ContainerStarted","Data":"6ca4c8f48df8cb1d6dbce19ee117d828a1c9d132209d18860317344245884603"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.631404 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" event={"ID":"3bf5ae18-6e08-436b-939f-03347eda68a8","Type":"ContainerStarted","Data":"4db8ccc541c061d081a9323a77409d3f74489ae183df9ab8fd0f2e159d724317"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.632765 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" event={"ID":"44b1188c-0fa6-48c7-bf76-6e65ca8174ec","Type":"ContainerStarted","Data":"9b10560cd7a82f92bfd106f54992b60372f8b0733f2557ed6d35133f54e4a049"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.633796 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" event={"ID":"bd443947-7241-49e3-9d98-f55329818dcc","Type":"ContainerStarted","Data":"a989e336f64ff7cf1c58f5a42db0104202ed4b4d051c3f332e2c9e517781f9a7"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.634868 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" event={"ID":"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4","Type":"ContainerStarted","Data":"204cc0a6efcad763856a3cccc66e8b7f6eb8d5d7d836742b65a51dd8d5c44d68"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.636427 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" event={"ID":"c7b12ebb-b568-4d15-abde-14db5041d5d2","Type":"ContainerStarted","Data":"dbe0e471a0166b6234040c27693dc6ad179beb1af99bb2ed63999c35c7c819b2"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.637991 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" event={"ID":"fc0602bb-5acd-426a-b3d6-3a2effb49bf3","Type":"ContainerStarted","Data":"e9a5c6a924e16844bc09c4b24e0eb4e75544e721253ee37e94fb371c303841c3"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.638982 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" event={"ID":"65b5ebf3-054c-4827-96b2-7ea0a26f20af","Type":"ContainerStarted","Data":"fbe1374ce073ed0e1d47b2e44a86e0a084fb5df52a66da08f5c5d03dd03252af"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.640146 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkswl" event={"ID":"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a","Type":"ContainerStarted","Data":"9edd670c034ec20665286543b42cdc84695e9c720e6d68f0bf0b804ab6d6f251"}
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.640462 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.640869 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.140851359 +0000 UTC m=+116.163152732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.741840 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.741998 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.241962848 +0000 UTC m=+116.264264221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.742399 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.742837 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.242812494 +0000 UTC m=+116.265113887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.758913 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.758963 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.760626 5116 patch_prober.go:28] interesting pod/apiserver-8596bd845d-f59q2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.760705 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" podUID="68cdf6e7-fccd-4375-9688-7a2bcbefd82f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused"
Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.843777 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:44 crc
kubenswrapper[5116]: E0322 00:10:44.843938 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.343907592 +0000 UTC m=+116.366208965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.844229 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.844563 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.344551043 +0000 UTC m=+116.366852416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.937936 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:44 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:44 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:44 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.938006 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.945122 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.945245 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:45.445225207 +0000 UTC m=+116.467526580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.945654 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.945934 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.44592352 +0000 UTC m=+116.468224903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.965122 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.965277 5116 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-cp4p2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.965338 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podUID="fe48c9c2-8783-475b-a961-d5a4110cb452" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.990707 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.046966 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 
00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.047399 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.547273636 +0000 UTC m=+116.569575009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.056268 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.059863 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.559840654 +0000 UTC m=+116.582142257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.158029 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.158464 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.658447753 +0000 UTC m=+116.680749126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.188949 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"] Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.259612 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.260021 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.760004036 +0000 UTC m=+116.782305409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.267887 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55520: no serving certificate available for the kubelet" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.357150 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.357856 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.358110 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.358670 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.358701 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.360381 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.360535 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.860514195 +0000 UTC m=+116.882815568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361055 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361278 5116 patch_prober.go:28] interesting pod/console-operator-67c89758df-vnd4f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361327 5116 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-67c89758df-vnd4f" podUID="b4117709-89bd-4e72-8016-0c25c0ece2c6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361407 5116 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-td5gr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.361428 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.861414863 +0000 UTC m=+116.883716236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361430 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podUID="a224637e-e693-4ae7-89c3-1a01e6c9a6f5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361484 5116 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-lf2zm 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361510 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361537 5116 patch_prober.go:28] interesting pod/catalog-operator-75ff9f647d-47j6l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361572 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" podUID="a1258288-8146-4cba-9d66-2a88e35a1fe9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.371675 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" podStartSLOduration=95.371657259 podStartE2EDuration="1m35.371657259s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.37106343 +0000 UTC m=+116.393364823" watchObservedRunningTime="2026-03-22 00:10:45.371657259 +0000 UTC m=+116.393958642" Mar 22 00:10:45 
crc kubenswrapper[5116]: I0322 00:10:45.397018 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" podStartSLOduration=95.396996313 podStartE2EDuration="1m35.396996313s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.391814777 +0000 UTC m=+116.414116150" watchObservedRunningTime="2026-03-22 00:10:45.396996313 +0000 UTC m=+116.419297686" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.444786 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" podStartSLOduration=96.444770499 podStartE2EDuration="1m36.444770499s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.428096159 +0000 UTC m=+116.450397542" watchObservedRunningTime="2026-03-22 00:10:45.444770499 +0000 UTC m=+116.467071872" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.462740 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.464658 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.964640348 +0000 UTC m=+116.986941721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.475203 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" podStartSLOduration=95.475183274 podStartE2EDuration="1m35.475183274s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.446490133 +0000 UTC m=+116.468791506" watchObservedRunningTime="2026-03-22 00:10:45.475183274 +0000 UTC m=+116.497484657" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.476710 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podStartSLOduration=95.476701031 podStartE2EDuration="1m35.476701031s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.474619306 +0000 UTC m=+116.496920699" watchObservedRunningTime="2026-03-22 00:10:45.476701031 +0000 UTC m=+116.499002424" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.565518 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: 
\"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.566019 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.066002975 +0000 UTC m=+117.088304348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.649929 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" event={"ID":"cf866f04-6739-40da-8c1c-36d192472220","Type":"ContainerStarted","Data":"98b80ed2f825f571051f8dfd1b02b4796223af2136881bbb7ab621d86c90d337"} Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.654944 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" event={"ID":"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee","Type":"ContainerStarted","Data":"04df04b126b799a76ce4d5634b109696cac20a5d85878acb5c48cd2efa1e4072"} Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.658586 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" event={"ID":"0eea7869-af21-4009-856f-65219d64ceea","Type":"ContainerStarted","Data":"a42679411685c6512bbfe6984a3a9a428152fe26ec7d0bfc12531ebbd29fbab8"} Mar 22 00:10:45 crc 
kubenswrapper[5116]: I0322 00:10:45.661751 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" event={"ID":"2512f5ef-a611-4637-b41f-41185def421b","Type":"ContainerStarted","Data":"80594bbc1ca27647c8954e15361d62b3ad3a7e1a7eec0475d5e013d18ce330ec"} Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.666521 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.666887 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.166871496 +0000 UTC m=+117.189172869 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.667908    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" podStartSLOduration=95.667883758 podStartE2EDuration="1m35.667883758s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.498073489 +0000 UTC m=+116.520374872" watchObservedRunningTime="2026-03-22 00:10:45.667883758 +0000 UTC m=+116.690185141"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.667984    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" event={"ID":"73928a0d-7a97-4c03-a5e8-6ab37119261c","Type":"ContainerStarted","Data":"ee91d3ca0e39b56fda13e403a733c1c343b35f8469fa149daa62e5e7bdf51948"}
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.668222    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" podStartSLOduration=95.668215879 podStartE2EDuration="1m35.668215879s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.665902285 +0000 UTC m=+116.688203658" watchObservedRunningTime="2026-03-22 00:10:45.668215879 +0000 UTC m=+116.690517262"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.672409    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" event={"ID":"bd443947-7241-49e3-9d98-f55329818dcc","Type":"ContainerStarted","Data":"0a1399231ec5c949259d7c311833b826b69a3a1d154f34f94c926c838d5b3c82"}
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.686757    5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" gracePeriod=30
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.687782    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkswl" event={"ID":"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a","Type":"ContainerStarted","Data":"6990c979394f4a46fe115e5e48d81ac6083326bf20dc3e26298eb49139789f17"}
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.687821    5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689281    5116 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-cp4p2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689323    5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podUID="fe48c9c2-8783-475b-a961-d5a4110cb452" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689814    5116 patch_prober.go:28] interesting pod/catalog-operator-75ff9f647d-47j6l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689874    5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" podUID="a1258288-8146-4cba-9d66-2a88e35a1fe9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695463    5116 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-td5gr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695521    5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podUID="a224637e-e693-4ae7-89c3-1a01e6c9a6f5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695881    5116 patch_prober.go:28] interesting pod/console-operator-67c89758df-vnd4f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695909    5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" podUID="b4117709-89bd-4e72-8016-0c25c0ece2c6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.696108    5116 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-lf2zm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.696233    5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.759269    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" podStartSLOduration=95.759254017 podStartE2EDuration="1m35.759254017s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.727719976 +0000 UTC m=+116.750021349" watchObservedRunningTime="2026-03-22 00:10:45.759254017 +0000 UTC m=+116.781555390"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.763123    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" podStartSLOduration=95.76310565 podStartE2EDuration="1m35.76310565s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.757887964 +0000 UTC m=+116.780189347" watchObservedRunningTime="2026-03-22 00:10:45.76310565 +0000 UTC m=+116.785407013"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.769719    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.771714    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.271693482 +0000 UTC m=+117.293994945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.787537    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" podStartSLOduration=95.787514704 podStartE2EDuration="1m35.787514704s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.776952769 +0000 UTC m=+116.799254152" watchObservedRunningTime="2026-03-22 00:10:45.787514704 +0000 UTC m=+116.809816077"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.799661    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rkswl" podStartSLOduration=8.799644699 podStartE2EDuration="8.799644699s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.799399901 +0000 UTC m=+116.821701284" watchObservedRunningTime="2026-03-22 00:10:45.799644699 +0000 UTC m=+116.821946072"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.815710    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podStartSLOduration=95.815693958 podStartE2EDuration="1m35.815693958s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.814840321 +0000 UTC m=+116.837141704" watchObservedRunningTime="2026-03-22 00:10:45.815693958 +0000 UTC m=+116.837995331"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.840668    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" podStartSLOduration=95.8406517 podStartE2EDuration="1m35.8406517s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.837526261 +0000 UTC m=+116.859827634" watchObservedRunningTime="2026-03-22 00:10:45.8406517 +0000 UTC m=+116.862953073"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.865464    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" podStartSLOduration=95.865439696 podStartE2EDuration="1m35.865439696s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.862237985 +0000 UTC m=+116.884539378" watchObservedRunningTime="2026-03-22 00:10:45.865439696 +0000 UTC m=+116.887741069"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.876639    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.877039    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.377018704 +0000 UTC m=+117.399320077 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.877182    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.877623    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.377613673 +0000 UTC m=+117.399915046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.883494    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" podStartSLOduration=95.883477639 podStartE2EDuration="1m35.883477639s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.882910871 +0000 UTC m=+116.905212264" watchObservedRunningTime="2026-03-22 00:10:45.883477639 +0000 UTC m=+116.905779022"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.935609    5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 22 00:10:45 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld
Mar 22 00:10:45 crc kubenswrapper[5116]: [+]process-running ok
Mar 22 00:10:45 crc kubenswrapper[5116]: healthz check failed
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.935699    5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.978412    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.978588    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.478558136 +0000 UTC m=+117.500859509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.978808    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.979137    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.479118024 +0000 UTC m=+117.501419397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.079710    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.079858    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.57983615 +0000 UTC m=+117.602137523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.079966    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.080315    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.580304805 +0000 UTC m=+117.602606178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.181491    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.181733    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.191677584 +0000 UTC m=+118.213978957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.182123    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.182692    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.192915674 +0000 UTC m=+118.215217047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.284013    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.284262    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.784230326 +0000 UTC m=+117.806531699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.284867    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.285220    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.785206206 +0000 UTC m=+117.807507579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.357831    5116 patch_prober.go:28] interesting pod/oauth-openshift-66458b6674-8qfhd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.357914    5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.385949    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.386361    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.886344635 +0000 UTC m=+117.908646008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.488337    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.488747    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.988725045 +0000 UTC m=+118.011026478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.590192    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.590364    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.090331248 +0000 UTC m=+118.112632621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.590587    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.590903    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.090890766 +0000 UTC m=+118.113192139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.691478    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.691719    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.191677584 +0000 UTC m=+118.213978957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.692527    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.692924    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.192915674 +0000 UTC m=+118.215217047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.695200    5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" event={"ID":"65b5ebf3-054c-4827-96b2-7ea0a26f20af","Type":"ContainerStarted","Data":"cb8e992c97289e8677dffbf9ed31d0c2a9abf46b9937c4c28ac09206f90119aa"}
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.697210    5116 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-td5gr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.697262    5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podUID="a224637e-e693-4ae7-89c3-1a01e6c9a6f5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.724710    5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" podStartSLOduration=96.724689471 podStartE2EDuration="1m36.724689471s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:46.72244969 +0000 UTC m=+117.744751073" watchObservedRunningTime="2026-03-22 00:10:46.724689471 +0000 UTC m=+117.746990854"
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.793795    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.794031    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.293995081 +0000 UTC m=+118.316296474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.794728    5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.796383    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.296368416 +0000 UTC m=+118.318669789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.895798    5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.896206    5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.396188163 +0000 UTC m=+118.418489536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.941705 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:46 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:46 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:46 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.941774 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.997675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.997974 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:47.497961773 +0000 UTC m=+118.520263146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.099222 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.099526 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.599508175 +0000 UTC m=+118.621809558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.201485 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.201921 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.701900854 +0000 UTC m=+118.724202227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.302931 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.303444 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.803428106 +0000 UTC m=+118.825729469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.404962 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.405375 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.90536214 +0000 UTC m=+118.927663513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.506589 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.506985 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.006967995 +0000 UTC m=+119.029269368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.608250 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.608614 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.10859476 +0000 UTC m=+119.130896133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.613452 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.710144 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.710570 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.210545814 +0000 UTC m=+119.232847207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.812005 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.812318 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.312304613 +0000 UTC m=+119.334605986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.855410 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.859859 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler\"/\"installer-sa-dockercfg-qpkss\"" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.862158 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"14cfa7846d0313cc9092a8f3bb677b9d20ea75addf2ca3bd62b03f25f030db5b"} Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.862209 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.862713 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler\"/\"kube-root-ca.crt\"" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.882051 5116 ???:1] "http: TLS handshake error from 192.168.126.11:48434: no serving certificate available for the kubelet" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.913892 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.913989 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.413964989 +0000 UTC m=+119.436266362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.914079 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.914116 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.914428 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.914468 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:48.414459924 +0000 UTC m=+119.436761297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.939544 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:47 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:47 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:47 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.939631 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015422 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.015653 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.515617054 +0000 UTC m=+119.537918427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015795 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015949 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015961 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 
00:10:48.016069 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.016265 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.516256725 +0000 UTC m=+119.538558168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.073050 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.117883 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 
00:10:48.118405 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.618386666 +0000 UTC m=+119.640688039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.190202 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.219302 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.219720 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.7197075 +0000 UTC m=+119.742008873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.320460 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.320776 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.820759357 +0000 UTC m=+119.843060730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.424148 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.425029 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.925013505 +0000 UTC m=+119.947314878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.532402 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.532778 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.032761285 +0000 UTC m=+120.055062658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.633817 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.634127 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.13411498 +0000 UTC m=+120.156416353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.640871 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.690548 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.692259 5116 patch_prober.go:28] interesting pod/openshift-config-operator-5777786469-wb6r8 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.692310 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podUID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.712450 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerStarted","Data":"5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92"} Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 
00:10:48.734549 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.734946 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.234927219 +0000 UTC m=+120.257228592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.836266 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.836804 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:49.336789972 +0000 UTC m=+120.359091345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.852052 5116 patch_prober.go:28] interesting pod/openshift-config-operator-5777786469-wb6r8 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.852117 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podUID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.852378 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.935748 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:48 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:48 crc kubenswrapper[5116]: [+]process-running ok Mar 22 
00:10:48 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.935971 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.937630 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.937794 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.437773676 +0000 UTC m=+120.460075049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.938008 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.938358 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.438345534 +0000 UTC m=+120.460646907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.039319 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.039678 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.539662208 +0000 UTC m=+120.561963581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.140835 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.141300 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.641279603 +0000 UTC m=+120.663580976 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.241906 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.242158 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.742124683 +0000 UTC m=+120.764426056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.280781 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.343982 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.344352 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.844337607 +0000 UTC m=+120.866638980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.367258 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.367409 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.379648 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.434783 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.434846 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445127 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.445321 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.945296231 +0000 UTC m=+120.967597604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445697 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445738 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445768 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " 
pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445954 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.446243 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.946151217 +0000 UTC m=+120.968452640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.468907 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.475308 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.478499 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.487639 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.546970 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.547124 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.0471008 +0000 UTC m=+121.069402173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547374 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547471 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547813 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547847 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"community-operators-zrcmf\" (UID: 
\"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547880 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547896 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547997 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.548486 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:50.048467574 +0000 UTC m=+121.070768947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.548482 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.556936 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.556997 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.590607 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") 
" pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.649288 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.649472 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.149442268 +0000 UTC m=+121.171743641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.649908 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"community-operators-zrcmf\" (UID: 
\"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650161 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650230 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.650526 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.150517452 +0000 UTC m=+121.172818825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650656 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650664 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.679405 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.682138 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.685444 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\""
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.692533 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.757010 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.757575 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.257543978 +0000 UTC m=+121.279845351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.792057 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.805816 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\""
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.806434 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.819822 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.819877 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerStarted","Data":"5eba216ccdb085e6ee94d8295dcb56560aa662e5daa7c913adf5cc47ec899e24"}
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.819985 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.821185 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.847364 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.847400 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f"}
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.848117 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859414 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859565 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859621 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859657 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.861147 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.361132675 +0000 UTC m=+121.383434048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.910263 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.939208 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.941311 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.951656 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 22 00:10:49 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld
Mar 22 00:10:49 crc kubenswrapper[5116]: [+]process-running ok
Mar 22 00:10:49 crc kubenswrapper[5116]: healthz check failed
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.951759 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.961837 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962010 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962071 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962118 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962885 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.962963 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.462934216 +0000 UTC m=+121.485235579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.963199 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.022892 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.052944 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/revision-pruner-6-crc" podStartSLOduration=3.052924161 podStartE2EDuration="3.052924161s" podCreationTimestamp="2026-03-22 00:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:49.993213766 +0000 UTC m=+121.015515149" watchObservedRunningTime="2026-03-22 00:10:50.052924161 +0000 UTC m=+121.075225534"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065628 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065687 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065713 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065796 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.066065 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.566053408 +0000 UTC m=+121.588354781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.159022 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166370 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166519 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166608 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166663 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.167043 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.167124 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.667109484 +0000 UTC m=+121.689410857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.167365 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.225193 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=28.225177067 podStartE2EDuration="28.225177067s" podCreationTimestamp="2026-03-22 00:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:50.223488963 +0000 UTC m=+121.245790346" watchObservedRunningTime="2026-03-22 00:10:50.225177067 +0000 UTC m=+121.247478440"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.238011 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.251261 5116 patch_prober.go:28] interesting pod/openshift-config-operator-5777786469-wb6r8 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded" start-of-body=
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.251336 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podUID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.268907 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.269287 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.769269586 +0000 UTC m=+121.791570959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.277955 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.339891 5116 patch_prober.go:28] interesting pod/apiserver-9ddfb9f55-m5dds container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]log ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]etcd ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/max-in-flight-filter ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 22 00:10:50 crc kubenswrapper[5116]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/openshift.io-startinformers ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 22 00:10:50 crc kubenswrapper[5116]: livez check failed
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.339974 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" podUID="8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.369937 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.370529 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.870509648 +0000 UTC m=+121.892811021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.406596 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"]
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.461656 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.461688 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.478089 5116 patch_prober.go:28] interesting pod/console-64d44f6ddf-9g5sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.478154 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d44f6ddf-9g5sg" podUID="4c2755ce-817d-47b0-9f19-7218641d0c5b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.479236 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.479701 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.979685722 +0000 UTC m=+122.001987095 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.543278 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"]
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.585031 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.586649 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.086626766 +0000 UTC m=+122.108928139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.687040 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.687466 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.187421324 +0000 UTC m=+122.209722697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.735221 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.788594 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.788948 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.288928876 +0000 UTC m=+122.311230259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.847265 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerStarted","Data":"58966d6ce1821ab01be1615acae0ea7c0ed73c7caaf2ca09de37bd0dfaf8db3c"}
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.848227 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerStarted","Data":"c8dbd89a41371e9d08f38390365ebbf2b2a5481a8e6093a86e6911bc41519ed3"}
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.849391 5116 generic.go:358] "Generic (PLEG): container finished" podID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerID="5eba216ccdb085e6ee94d8295dcb56560aa662e5daa7c913adf5cc47ec899e24" exitCode=0
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.849442 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerDied","Data":"5eba216ccdb085e6ee94d8295dcb56560aa662e5daa7c913adf5cc47ec899e24"}
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.871473 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerStarted","Data":"96e17397bb6bb8c0e2ce3437a4637533d85e31f9659bc8478852a483b13dd5dd"}
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.890036 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.890399 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.390371384 +0000 UTC m=+122.412672757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.932226 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.935026 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 22 00:10:50 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld
Mar 22 00:10:50 crc kubenswrapper[5116]: [+]process-running ok
Mar 22 00:10:50 crc kubenswrapper[5116]: healthz check failed
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.935107 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.990889 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.991047 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.491021868 +0000 UTC m=+122.513323241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.991436 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.992780 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.492762453 +0000 UTC m=+122.515063936 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.992881 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npbn6"] Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.092720 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.093094 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.593079306 +0000 UTC m=+122.615380679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.194324 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.194706 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.694687461 +0000 UTC m=+122.716988834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.295855 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.296053 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.796024746 +0000 UTC m=+122.818326119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.296731 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.297088 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.797072059 +0000 UTC m=+122.819373432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.398502 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.398688 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.898649693 +0000 UTC m=+122.920951066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.398996 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.399311 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.899301464 +0000 UTC m=+122.921602837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.467581 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.500191 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.500763 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.000743672 +0000 UTC m=+123.023045045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.601988 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.602359 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.102343456 +0000 UTC m=+123.124644829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.702762 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.703184 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.203153545 +0000 UTC m=+123.225454918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.804151 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.804589 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.304573853 +0000 UTC m=+123.326875236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.876373 5116 generic.go:358] "Generic (PLEG): container finished" podID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerID="48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7" exitCode=0 Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.879200 5116 generic.go:358] "Generic (PLEG): container finished" podID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791" exitCode=0 Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.880850 5116 generic.go:358] "Generic (PLEG): container finished" podID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerID="650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743" exitCode=0 Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.905769 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.905940 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:52.405913529 +0000 UTC m=+123.428214902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.906158 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.906541 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.406523507 +0000 UTC m=+123.428824960 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.936754 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:51 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:51 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:51 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.936823 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.008665 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.009078 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:52.509061362 +0000 UTC m=+123.531362735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.110556 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.111009 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.610989886 +0000 UTC m=+123.633291259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.211541 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.211717 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.711684101 +0000 UTC m=+123.733985484 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.212044 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.212424 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.712410644 +0000 UTC m=+123.734712017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.313923 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.314239 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.814206784 +0000 UTC m=+123.836508167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.314874 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315033 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerDied","Data":"48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315214 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315266 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerStarted","Data":"45e4dcc16acc5b874128c4be6959ee16a39761f0d8b86acfc6b789f6f799c066"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315290 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" 
event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315358 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315415 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.319652 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.819635527 +0000 UTC m=+123.841936900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.321585 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.328467 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.328527 5116 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416588 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416862 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416931 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416963 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.417974 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:52.917953826 +0000 UTC m=+123.940255209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.459656 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.459854 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.473358 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.497721 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.497849 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.517944 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518005 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518036 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518076 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518105 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518132 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518153 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.519669 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.019652943 +0000 UTC m=+124.041954316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.519750 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.519982 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.544989 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.559709 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.562346 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.562531 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.569300 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.619548 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.619773 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.119735599 +0000 UTC m=+124.142036992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.619891 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"06481163-cdc1-4b43-b6c2-73f2672feb42\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620030 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "06481163-cdc1-4b43-b6c2-73f2672feb42" (UID: "06481163-cdc1-4b43-b6c2-73f2672feb42"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620073 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"06481163-cdc1-4b43-b6c2-73f2672feb42\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620243 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620291 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620344 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620392 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"redhat-marketplace-wlccm\" (UID: 
\"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620486 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620530 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.620560 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.120543394 +0000 UTC m=+124.142844757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620827 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620834 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.621050 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.621334 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.627855 5116 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "06481163-cdc1-4b43-b6c2-73f2672feb42" (UID: "06481163-cdc1-4b43-b6c2-73f2672feb42"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.639646 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.669207 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.683357 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerName="pruner" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.683395 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerName="pruner" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.683690 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerName="pruner" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.685758 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.723282 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.723620 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.223596655 +0000 UTC m=+124.245898028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726185 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726376 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"redhat-operators-wss9d\" (UID: 
\"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726544 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726649 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726798 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.727608 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.727933 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc 
kubenswrapper[5116]: E0322 00:10:52.728390 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.228375296 +0000 UTC m=+124.250676669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.747643 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.789706 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.828645 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.828898 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.328870855 +0000 UTC m=+124.351172238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.829377 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.829899 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: 
nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.329890528 +0000 UTC m=+124.352191901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.890715 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.903413 5116 generic.go:358] "Generic (PLEG): container finished" podID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867" exitCode=0 Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.930787 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.931092 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.431065557 +0000 UTC m=+124.453366930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.931496 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.931873 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.431851873 +0000 UTC m=+124.454153306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.934643 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:52 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:52 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:52 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.934706 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.033269 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.033561 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:53.533527059 +0000 UTC m=+124.555828442 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.034039 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.034367 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.534353556 +0000 UTC m=+124.556654929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.057214 5116 ???:1] "http: TLS handshake error from 192.168.126.11:48436: no serving certificate available for the kubelet" Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.135797 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.136250 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.636229998 +0000 UTC m=+124.658531371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.237636 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.237978 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.737965236 +0000 UTC m=+124.760266609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.339048 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.339225 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.839197349 +0000 UTC m=+124.861498722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.339467 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.339764 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.839751246 +0000 UTC m=+124.862052619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.440527 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.440834 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.940783282 +0000 UTC m=+124.963084655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.441393 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.441718 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.941702301 +0000 UTC m=+124.964003674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.542469 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.542692 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.042656014 +0000 UTC m=+125.064957407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.543101 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.543465 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.04344877 +0000 UTC m=+125.065750213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.643914 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.644106 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.144078893 +0000 UTC m=+125.166380266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.644603 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.644959 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.14494254 +0000 UTC m=+125.167243913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.746356 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.746581 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.246548324 +0000 UTC m=+125.268849707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.746777 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.747138 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.247119742 +0000 UTC m=+125.269421115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.847607 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.847837 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.347807577 +0000 UTC m=+125.370108950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.847941 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.848352 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.348334144 +0000 UTC m=+125.370635517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.934851 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:53 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:53 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:53 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.934926 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.949525 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.949712 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:54.44967492 +0000 UTC m=+125.471976293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.950341 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.950713 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.450698082 +0000 UTC m=+125.472999455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.051462 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.051657 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.551629515 +0000 UTC m=+125.573930888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.051937 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.052254 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.552242384 +0000 UTC m=+125.574543757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116694 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerDied","Data":"5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116748 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116806 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116998 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125037 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125080 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125092 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125115 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125134 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125153 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"75f960061cbc0329dfd322c24b6916521266afa146009455e9c761967544b284"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125186 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.153413 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc 
kubenswrapper[5116]: E0322 00:10:54.153609 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.65357684 +0000 UTC m=+125.675878213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.153731 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.154118 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.654102017 +0000 UTC m=+125.676403490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.254744 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.254921 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.754891195 +0000 UTC m=+125.777192568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255268 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255303 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255344 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255826 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod 
\"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.256211 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.756188926 +0000 UTC m=+125.778490309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.356751 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.356924 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.856894711 +0000 UTC m=+125.879196084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357365 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357438 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357471 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357513 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.357884 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.857867572 +0000 UTC m=+125.880168945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357888 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.358026 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.376807 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430590 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerStarted","Data":"96be2eef25c44ecc457a8a1fa10ad35be205dda004792bd0cccdb43de654bdc4"}
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430652 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerStarted","Data":"82447639f8664a7a9be68c50329975aae58c8919cedaf2554c6f5ebb2a14ac22"}
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430676 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerStarted","Data":"4076d8a97891e463c22fe9847edcf67c692b44a8dbcd9aa75ba00b5c2c7fdc81"}
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430694 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430805 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.432625 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\""
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.432642 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.439862 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.440802 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.460000 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.460367 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.960350144 +0000 UTC m=+125.982651517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.520863 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563532 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563563 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563602 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.564677 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.564788 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.564991 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.567609 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.067585697 +0000 UTC m=+126.089887070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.568006 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.569047 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.571737 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.579589 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.587056 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.602400 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.608472 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.609271 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.622487 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.641483 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.653369 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.666081 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.666353 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.16632136 +0000 UTC m=+126.188622733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.666484 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.667407 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.667459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.668868 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.16885422 +0000 UTC m=+126.191155593 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.669061 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.694967 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.768843 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.769448 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.269431812 +0000 UTC m=+126.291733185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.835544 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.854499 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"]
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.871156 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.872706 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.372684848 +0000 UTC m=+126.394986221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.945379 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 22 00:10:54 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld
Mar 22 00:10:54 crc kubenswrapper[5116]: [+]process-running ok
Mar 22 00:10:54 crc kubenswrapper[5116]: healthz check failed
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.945433 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.949781 5116 generic.go:358] "Generic (PLEG): container finished" podID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" exitCode=0
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.949887 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5"}
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.969965 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.970196 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerStarted","Data":"60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c"}
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.972578 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.973314 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.973481 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.473455996 +0000 UTC m=+126.495757379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.974133 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.974187 5116 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" probeResult="unknown"
Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.974449 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.974733 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.474722535 +0000 UTC m=+126.497023908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.075747 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.075899 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.575877375 +0000 UTC m=+126.598178748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.077008 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.078957 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.578937052 +0000 UTC m=+126.601238425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.142085 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.180504 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.180799 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.680766444 +0000 UTC m=+126.703067817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.287485 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.288113 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.78809852 +0000 UTC m=+126.810399893 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.347528 5116 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.363716 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.390627 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.390983 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.890962124 +0000 UTC m=+126.913263497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.492886 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.493214 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.993198497 +0000 UTC m=+127.015499870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.593666 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.593897 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:56.093880522 +0000 UTC m=+127.116181895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.693736 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.694727 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.694986 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:56.19497477 +0000 UTC m=+127.217276143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.695936 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.717073 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.717134 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.717184 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.803800 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.804954 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:56.30493441 +0000 UTC m=+127.327235793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.806679 5116 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-22T00:10:55.347558696Z","UUID":"eb6ef366-15c5-4ad3-9363-4593b9e8b329","Handler":null,"Name":"","Endpoint":""} Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.809827 5116 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.809857 5116 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.906882 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.912945 5116 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.912984 5116 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount\"" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.936350 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:55 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:55 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:55 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.936608 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.992049 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:56 crc 
kubenswrapper[5116]: I0322 00:10:56.001971 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerStarted","Data":"130739c8c4fc9560744543b32a2dedebe00df0b00c06be853ce09fafeeaac564"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.002054 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerStarted","Data":"25a14810c25178d5701d5a77da67d4d6de2709a0d6bd4a338a91bc57a5e63c27"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.013393 5116 generic.go:358] "Generic (PLEG): container finished" podID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerID="60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c" exitCode=0 Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.013559 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.013681 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.022114 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"936d15425a95ca7340bf36b5d29e85e0b504aee88e45479073a76084b212cad1"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.022321 
5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"51467a890456af3ec283aae28bac52ca164efa856026a21eda41dfc79e8cffb4"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.022840 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.025471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"037bb9ee1a61b40ccd405d0e4fad4e60c2db137b49afe4bcb9c35c5d0f724be3"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.025502 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"b16e7159c9d6254a30e19b529014e9d9930a702bf13c8c0364e78a839e55a397"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.028520 5116 generic.go:358] "Generic (PLEG): container finished" podID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerID="f6ccc3cf8e5e1fac21a450937d818f73d9c8ea21d213cc087495650a551817ba" exitCode=0 Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.028627 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"f6ccc3cf8e5e1fac21a450937d818f73d9c8ea21d213cc087495650a551817ba"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.033622 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: 
"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". PluginName "kubernetes.io/csi", VolumeGIDValue "" Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.033701 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"e6c5bea9360e0d6bf8238510407377ca2564dff36e62aa24d3ae073673381431"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.033752 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"a0d97d493c5b4a9902109682c624deddf807b1f2688a8dd2bc5d8fbe7851a740"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.046207 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-11-crc" podStartSLOduration=3.046160594 podStartE2EDuration="3.046160594s" podCreationTimestamp="2026-03-22 00:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:56.032686147 +0000 UTC m=+127.054987520" watchObservedRunningTime="2026-03-22 00:10:56.046160594 +0000 UTC m=+127.068461967" Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.052038 5116 generic.go:358] "Generic (PLEG): container finished" podID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" exitCode=0 Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.052330 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" 
event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.052391 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerStarted","Data":"5618b991bbeac41e6785299c47466f86c4e18d2882dd3d7064face6121177c93"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.061640 5116 generic.go:358] "Generic (PLEG): container finished" podID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerID="cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923" exitCode=0 Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.061745 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerDied","Data":"cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.068060 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"5aba16f642e38c56ebd13cbed0fe8ffe84084bc59fa041b240ad7c62484dc1fe"} Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.218056 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\"" Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.227280 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.701731 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.772388 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.934719 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:56 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:56 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:56 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.934784 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.080010 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"28cf8da7c74cf08d2480113a46e665cb47e7c90c0ce62edd51dce536553dc5c7"} Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.081141 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerStarted","Data":"bca4de0caba9859b14c3f0eb17a3776e71425e24f9833420070510404cc3406c"} Mar 22 00:10:57 crc 
kubenswrapper[5116]: I0322 00:10:57.082698 5116 generic.go:358] "Generic (PLEG): container finished" podID="087356d8-050c-4861-8799-76df7a8330cb" containerID="130739c8c4fc9560744543b32a2dedebe00df0b00c06be853ce09fafeeaac564" exitCode=0 Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.082887 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerDied","Data":"130739c8c4fc9560744543b32a2dedebe00df0b00c06be853ce09fafeeaac564"} Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.123876 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" podStartSLOduration=20.123853971 podStartE2EDuration="20.123853971s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:57.104007611 +0000 UTC m=+128.126308984" watchObservedRunningTime="2026-03-22 00:10:57.123853971 +0000 UTC m=+128.146155344" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.355478 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.372542 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.372719 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.372752 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.373664 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" (UID: "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.379827 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn" (OuterVolumeSpecName: "kube-api-access-clsgn") pod "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" (UID: "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1"). 
InnerVolumeSpecName "kube-api-access-clsgn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.380375 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" (UID: "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.474822 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.474864 5116 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.474877 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.718031 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9b5059-1b3e-4067-a63d-2952cbe863af" path="/var/lib/kubelet/pods/9e9b5059-1b3e-4067-a63d-2952cbe863af/volumes" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.943114 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.948059 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:58 
crc kubenswrapper[5116]: I0322 00:10:58.090734 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerDied","Data":"dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9"} Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.090779 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9" Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.090883 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.095352 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerStarted","Data":"c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c"} Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.096200 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.116193 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" podStartSLOduration=108.116110596 podStartE2EDuration="1m48.116110596s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:58.113567945 +0000 UTC m=+129.135869338" watchObservedRunningTime="2026-03-22 00:10:58.116110596 +0000 UTC m=+129.138411969" Mar 22 00:10:59 crc kubenswrapper[5116]: I0322 00:10:59.556502 5116 patch_prober.go:28] interesting 
pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:10:59 crc kubenswrapper[5116]: I0322 00:10:59.556902 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.462502 5116 patch_prober.go:28] interesting pod/console-64d44f6ddf-9g5sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.462570 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d44f6ddf-9g5sg" podUID="4c2755ce-817d-47b0-9f19-7218641d0c5b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.839099 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.878075 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026254 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"087356d8-050c-4861-8799-76df7a8330cb\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026402 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "087356d8-050c-4861-8799-76df7a8330cb" (UID: "087356d8-050c-4861-8799-76df7a8330cb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026508 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"087356d8-050c-4861-8799-76df7a8330cb\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026766 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.034111 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "087356d8-050c-4861-8799-76df7a8330cb" (UID: 
"087356d8-050c-4861-8799-76df7a8330cb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.120432 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerDied","Data":"25a14810c25178d5701d5a77da67d4d6de2709a0d6bd4a338a91bc57a5e63c27"} Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.120480 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25a14810c25178d5701d5a77da67d4d6de2709a0d6bd4a338a91bc57a5e63c27" Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.120600 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.127414 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:02 crc kubenswrapper[5116]: I0322 00:11:02.169136 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:11:02 crc kubenswrapper[5116]: I0322 00:11:02.497237 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:11:02 crc kubenswrapper[5116]: I0322 00:11:02.497646 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 
10.217.0.10:8080: connect: connection refused" Mar 22 00:11:03 crc kubenswrapper[5116]: I0322 00:11:03.325792 5116 ???:1] "http: TLS handshake error from 192.168.126.11:52016: no serving certificate available for the kubelet" Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.968322 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.970625 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.971683 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.971726 5116 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Mar 22 00:11:10 crc kubenswrapper[5116]: I0322 00:11:10.480600 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:11:10 crc kubenswrapper[5116]: I0322 00:11:10.491966 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:11:12 crc kubenswrapper[5116]: I0322 00:11:12.513650 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.968329 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.969576 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.970891 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.970931 5116 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" 
probeResult="unknown" Mar 22 00:11:15 crc kubenswrapper[5116]: W0322 00:11:15.735803 5116 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-conmon-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-conmon-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope: no such file or directory Mar 22 00:11:15 crc kubenswrapper[5116]: W0322 00:11:15.736094 5116 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope: no such file or directory Mar 22 00:11:15 crc kubenswrapper[5116]: W0322 00:11:15.750901 5116 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-pod087356d8_050c_4861_8799_76df7a8330cb.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-pod087356d8_050c_4861_8799_76df7a8330cb.slice: no such file or directory Mar 22 00:11:15 crc kubenswrapper[5116]: E0322 00:11:15.844334 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3b0eb3_e48f_4080_bfdc_522f18cf2876.slice/crio-48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice/crio-conmon-cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77380b82_4c44_4cfd_a7b1_e77b060af507.slice/crio-650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod06481163_cdc1_4b43_b6c2_73f2672feb42.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice/crio-cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d2a94f_b4d4_4cdc_b862_a4866cadaea1.slice/crio-a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77380b82_4c44_4cfd_a7b1_e77b060af507.slice/crio-conmon-650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod06481163_cdc1_4b43_b6c2_73f2672feb42.slice/crio-5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3b0eb3_e48f_4080_bfdc_522f18cf2876.slice/crio-conmon-48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7.scope\": RecentStats: unable to find data in memory cache]" Mar 22 00:11:16 crc kubenswrapper[5116]: I0322 
00:11:16.500615 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-57nbs_f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/kube-multus-additional-cni-plugins/0.log" Mar 22 00:11:16 crc kubenswrapper[5116]: I0322 00:11:16.501441 5116 generic.go:358] "Generic (PLEG): container finished" podID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" exitCode=137 Mar 22 00:11:16 crc kubenswrapper[5116]: I0322 00:11:16.501566 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerDied","Data":"a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e"} Mar 22 00:11:16 crc kubenswrapper[5116]: I0322 00:11:16.701107 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:11:20 crc kubenswrapper[5116]: I0322 00:11:20.111963 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.751034 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-57nbs_f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/kube-multus-additional-cni-plugins/0.log" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.751128 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767239 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767398 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767465 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767529 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767752 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.768197 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready" (OuterVolumeSpecName: "ready") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.768577 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.774986 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf" (OuterVolumeSpecName: "kube-api-access-7mkbf") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "kube-api-access-7mkbf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.868986 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.869279 5116 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.869292 5116 reconciler_common.go:299] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.869328 5116 reconciler_common.go:299] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.543096 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-57nbs_f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/kube-multus-additional-cni-plugins/0.log" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.543893 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.545387 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerDied","Data":"03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d"} Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.545440 5116 scope.go:117] "RemoveContainer" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.552087 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerStarted","Data":"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"} Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.595438 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"] Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.602232 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"] Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.559735 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerStarted","Data":"4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.562885 5116 generic.go:358] "Generic (PLEG): container finished" podID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.563000 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" 
event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.569605 5116 generic.go:358] "Generic (PLEG): container finished" podID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerID="3662e71bd41b60c7bbef1f51273ae388448fc2e3a846e9f692b29bbba4929dce" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.569733 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"3662e71bd41b60c7bbef1f51273ae388448fc2e3a846e9f692b29bbba4929dce"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.573515 5116 generic.go:358] "Generic (PLEG): container finished" podID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerID="bd585f6a2418bf617978e44c5ded778fb5ab883949c7c6d99346b0cce7aab8d6" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.573616 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerDied","Data":"bd585f6a2418bf617978e44c5ded778fb5ab883949c7c6d99346b0cce7aab8d6"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.590419 5116 generic.go:358] "Generic (PLEG): container finished" podID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.590621 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.592353 5116 generic.go:358] "Generic (PLEG): container 
finished" podID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.592432 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.596425 5116 generic.go:358] "Generic (PLEG): container finished" podID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.596795 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.606998 5116 generic.go:358] "Generic (PLEG): container finished" podID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerID="30c022eef87348aae4e9bdbc424e5f6c1baa0356ea65b1994f9191806ffd90dd" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.607247 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"30c022eef87348aae4e9bdbc424e5f6c1baa0356ea65b1994f9191806ffd90dd"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.721259 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" path="/var/lib/kubelet/pods/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/volumes" Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.847328 5116 ???:1] "http: TLS handshake error from 192.168.126.11:38976: no 
serving certificate available for the kubelet" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.617416 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerStarted","Data":"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.621323 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerStarted","Data":"58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.624938 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerStarted","Data":"fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.627951 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerStarted","Data":"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.634540 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerStarted","Data":"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.637959 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerStarted","Data":"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07"} Mar 22 
00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.640606 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerStarted","Data":"c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.644412 5116 generic.go:358] "Generic (PLEG): container finished" podID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerID="4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6" exitCode=0 Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.644472 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.668077 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-npbn6" podStartSLOduration=7.688369582 podStartE2EDuration="35.668052984s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:54.117970129 +0000 UTC m=+125.140271502" lastFinishedPulling="2026-03-22 00:11:22.097653511 +0000 UTC m=+153.119954904" observedRunningTime="2026-03-22 00:11:24.6647619 +0000 UTC m=+155.687063293" watchObservedRunningTime="2026-03-22 00:11:24.668052984 +0000 UTC m=+155.690354347" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.669673 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbgnq" podStartSLOduration=6.622493156 podStartE2EDuration="32.669646935s" podCreationTimestamp="2026-03-22 00:10:52 +0000 UTC" firstStartedPulling="2026-03-22 00:10:56.053998923 +0000 UTC m=+127.076300286" lastFinishedPulling="2026-03-22 00:11:22.101152692 +0000 UTC m=+153.123454065" 
observedRunningTime="2026-03-22 00:11:24.643470587 +0000 UTC m=+155.665771970" watchObservedRunningTime="2026-03-22 00:11:24.669646935 +0000 UTC m=+155.691948308" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.683311 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wlccm" podStartSLOduration=6.799833974 podStartE2EDuration="33.683286856s" podCreationTimestamp="2026-03-22 00:10:51 +0000 UTC" firstStartedPulling="2026-03-22 00:10:54.958520081 +0000 UTC m=+125.980821444" lastFinishedPulling="2026-03-22 00:11:21.841972953 +0000 UTC m=+152.864274326" observedRunningTime="2026-03-22 00:11:24.682190051 +0000 UTC m=+155.704491424" watchObservedRunningTime="2026-03-22 00:11:24.683286856 +0000 UTC m=+155.705588229" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.746910 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kp7rb" podStartSLOduration=7.665904741 podStartE2EDuration="33.746894955s" podCreationTimestamp="2026-03-22 00:10:51 +0000 UTC" firstStartedPulling="2026-03-22 00:10:56.017484124 +0000 UTC m=+127.039785497" lastFinishedPulling="2026-03-22 00:11:22.098474318 +0000 UTC m=+153.120775711" observedRunningTime="2026-03-22 00:11:24.744655045 +0000 UTC m=+155.766956438" watchObservedRunningTime="2026-03-22 00:11:24.746894955 +0000 UTC m=+155.769196328" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.747338 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrcmf" podStartSLOduration=6.3151667719999995 podStartE2EDuration="35.747331619s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:52.317468208 +0000 UTC m=+123.339769581" lastFinishedPulling="2026-03-22 00:11:21.749633045 +0000 UTC m=+152.771934428" observedRunningTime="2026-03-22 00:11:24.709951658 +0000 UTC m=+155.732253041" 
watchObservedRunningTime="2026-03-22 00:11:24.747331619 +0000 UTC m=+155.769632992" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.772388 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4x6l" podStartSLOduration=6.24798832 podStartE2EDuration="35.77236138s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:52.317670385 +0000 UTC m=+123.339971768" lastFinishedPulling="2026-03-22 00:11:21.842043415 +0000 UTC m=+152.864344828" observedRunningTime="2026-03-22 00:11:24.770697607 +0000 UTC m=+155.792998990" watchObservedRunningTime="2026-03-22 00:11:24.77236138 +0000 UTC m=+155.794662753" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.797217 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hppqv" podStartSLOduration=6.234293697 podStartE2EDuration="35.797196625s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:52.316553979 +0000 UTC m=+123.338855352" lastFinishedPulling="2026-03-22 00:11:21.879456907 +0000 UTC m=+152.901758280" observedRunningTime="2026-03-22 00:11:24.79450112 +0000 UTC m=+155.816802503" watchObservedRunningTime="2026-03-22 00:11:24.797196625 +0000 UTC m=+155.819498008" Mar 22 00:11:25 crc kubenswrapper[5116]: I0322 00:11:25.653623 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerStarted","Data":"a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9"} Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.641912 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wss9d" podStartSLOduration=8.53420339 podStartE2EDuration="34.641892965s" podCreationTimestamp="2026-03-22 00:10:52 +0000 UTC" 
firstStartedPulling="2026-03-22 00:10:56.029603999 +0000 UTC m=+127.051905372" lastFinishedPulling="2026-03-22 00:11:22.137293574 +0000 UTC m=+153.159594947" observedRunningTime="2026-03-22 00:11:25.682323074 +0000 UTC m=+156.704624447" watchObservedRunningTime="2026-03-22 00:11:26.641892965 +0000 UTC m=+157.664194338" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.643571 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"] Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644223 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644244 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644262 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087356d8-050c-4861-8799-76df7a8330cb" containerName="pruner" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644268 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="087356d8-050c-4861-8799-76df7a8330cb" containerName="pruner" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644280 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerName="collect-profiles" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644287 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerName="collect-profiles" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644384 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="087356d8-050c-4861-8799-76df7a8330cb" containerName="pruner" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 
00:11:26.644395 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644408 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerName="collect-profiles" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.652471 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.654232 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\"" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.654279 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\"" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.654357 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"] Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.729132 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.729234 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc 
kubenswrapper[5116]: I0322 00:11:26.830252 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.830542 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.830996 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.859067 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.971769 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 22 00:11:27 crc kubenswrapper[5116]: I0322 00:11:27.087725 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:11:27 crc kubenswrapper[5116]: I0322 00:11:27.381233 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"]
Mar 22 00:11:27 crc kubenswrapper[5116]: W0322 00:11:27.387414 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf36eaa2d_3ae4_4e89_991a_f7f42d317944.slice/crio-9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab WatchSource:0}: Error finding container 9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab: Status 404 returned error can't find the container with id 9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab
Mar 22 00:11:27 crc kubenswrapper[5116]: I0322 00:11:27.667825 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"f36eaa2d-3ae4-4e89-991a-f7f42d317944","Type":"ContainerStarted","Data":"9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab"}
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.696360 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.707708 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.816599 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.817012 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.959906 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.963298 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.160077 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.160534 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.203018 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.278697 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.278733 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.329689 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.689178 5116 generic.go:358] "Generic (PLEG): container finished" podID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerID="661758079b22658bb8c2b551471b833a0b0446a747ae3daba963976c194cf27a" exitCode=0
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.689256 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"f36eaa2d-3ae4-4e89-991a-f7f42d317944","Type":"ContainerDied","Data":"661758079b22658bb8c2b551471b833a0b0446a747ae3daba963976c194cf27a"}
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.736368 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.736962 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.745510 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.745729 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:31 crc kubenswrapper[5116]: I0322 00:11:31.909537 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002035 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") "
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002160 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") "
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002153 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f36eaa2d-3ae4-4e89-991a-f7f42d317944" (UID: "f36eaa2d-3ae4-4e89-991a-f7f42d317944"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002419 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.010127 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f36eaa2d-3ae4-4e89-991a-f7f42d317944" (UID: "f36eaa2d-3ae4-4e89-991a-f7f42d317944"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.076774 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.103243 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.274115 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.687601 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.687669 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703526 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703559 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"f36eaa2d-3ae4-4e89-991a-f7f42d317944","Type":"ContainerDied","Data":"9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab"}
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703599 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703961 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hppqv" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server" containerID="cri-o://85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f" gracePeriod=2
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.757935 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.790387 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.790669 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.812983 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.891565 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.891626 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.914288 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.930977 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.575210 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.626897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") "
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.627009 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") "
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.627074 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") "
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.629332 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities" (OuterVolumeSpecName: "utilities") pod "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" (UID: "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.637645 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx" (OuterVolumeSpecName: "kube-api-access-c4dfx") pod "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" (UID: "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70"). InnerVolumeSpecName "kube-api-access-c4dfx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.669950 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" (UID: "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711405 5116 generic.go:358] "Generic (PLEG): container finished" podID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f" exitCode=0
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711489 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"}
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711543 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"58966d6ce1821ab01be1615acae0ea7c0ed73c7caaf2ca09de37bd0dfaf8db3c"}
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711544 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711570 5116 scope.go:117] "RemoveContainer" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.712099 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-npbn6" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server" containerID="cri-o://d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354" gracePeriod=2
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.728716 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.728781 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.728797 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.735921 5116 scope.go:117] "RemoveContainer" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.754410 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.758899 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.759255 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.766947 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.770855 5116 scope.go:117] "RemoveContainer" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.823734 5116 scope.go:117] "RemoveContainer" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"
Mar 22 00:11:33 crc kubenswrapper[5116]: E0322 00:11:33.824232 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f\": container with ID starting with 85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f not found: ID does not exist" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824267 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"} err="failed to get container status \"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f\": rpc error: code = NotFound desc = could not find container \"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f\": container with ID starting with 85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f not found: ID does not exist"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824312 5116 scope.go:117] "RemoveContainer" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"
Mar 22 00:11:33 crc kubenswrapper[5116]: E0322 00:11:33.824733 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79\": container with ID starting with 96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79 not found: ID does not exist" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824764 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"} err="failed to get container status \"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79\": rpc error: code = NotFound desc = could not find container \"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79\": container with ID starting with 96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79 not found: ID does not exist"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824780 5116 scope.go:117] "RemoveContainer" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"
Mar 22 00:11:33 crc kubenswrapper[5116]: E0322 00:11:33.825058 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791\": container with ID starting with 4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791 not found: ID does not exist" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.825088 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"} err="failed to get container status \"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791\": rpc error: code = NotFound desc = could not find container \"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791\": container with ID starting with 4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791 not found: ID does not exist"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.034681 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.133134 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") "
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.134042 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") "
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.134094 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") "
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.135764 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities" (OuterVolumeSpecName: "utilities") pod "09fbdb0d-3da3-4d36-9a96-4ed0caa53799" (UID: "09fbdb0d-3da3-4d36-9a96-4ed0caa53799"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.139901 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc" (OuterVolumeSpecName: "kube-api-access-dbjjc") pod "09fbdb0d-3da3-4d36-9a96-4ed0caa53799" (UID: "09fbdb0d-3da3-4d36-9a96-4ed0caa53799"). InnerVolumeSpecName "kube-api-access-dbjjc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.218948 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09fbdb0d-3da3-4d36-9a96-4ed0caa53799" (UID: "09fbdb0d-3da3-4d36-9a96-4ed0caa53799"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.236600 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.236669 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.236682 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.441473 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.441977 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.496501 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722767 5116 generic.go:358] "Generic (PLEG): container finished" podID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354" exitCode=0
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722846 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"}
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722906 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722951 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"45e4dcc16acc5b874128c4be6959ee16a39761f0d8b86acfc6b789f6f799c066"}
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722985 5116 scope.go:117] "RemoveContainer" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.745114 5116 scope.go:117] "RemoveContainer" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.757654 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.762466 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.779936 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.794190 5116 scope.go:117] "RemoveContainer" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.812988 5116 scope.go:117] "RemoveContainer" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"
Mar 22 00:11:34 crc kubenswrapper[5116]: E0322 00:11:34.814562 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354\": container with ID starting with d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354 not found: ID does not exist" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.814605 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"} err="failed to get container status \"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354\": rpc error: code = NotFound desc = could not find container \"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354\": container with ID starting with d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354 not found: ID does not exist"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.814630 5116 scope.go:117] "RemoveContainer" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"
Mar 22 00:11:34 crc kubenswrapper[5116]: E0322 00:11:34.815652 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299\": container with ID starting with 1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299 not found: ID does not exist" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.815706 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"} err="failed to get container status \"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299\": rpc error: code = NotFound desc = could not find container \"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299\": container with ID starting with 1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299 not found: ID does not exist"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.815739 5116 scope.go:117] "RemoveContainer" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"
Mar 22 00:11:34 crc kubenswrapper[5116]: E0322 00:11:34.816096 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867\": container with ID starting with bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867 not found: ID does not exist" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.816123 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"} err="failed to get container status \"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867\": rpc error: code = NotFound desc = could not find container \"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867\": container with ID starting with bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867 not found: ID does not exist"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.439724 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-12-crc"]
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440331 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440344 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440359 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerName="pruner"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440364 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerName="pruner"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440373 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440380 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440393 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440399 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440408 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440413 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440420 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440425 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440439 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440444 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440549 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440562 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440573 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerName="pruner"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.446042 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.454727 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.459593 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\""
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.459947 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"]
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.553946 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.554004 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.554024 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655416 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655489 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655530 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655566 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655654 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.680740 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") "
pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.704915 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" path="/var/lib/kubelet/pods/09fbdb0d-3da3-4d36-9a96-4ed0caa53799/volumes" Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.705634 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" path="/var/lib/kubelet/pods/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70/volumes" Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.763413 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.197626 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"] Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.471162 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.670135 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.740539 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerStarted","Data":"ee145f68231146608459069e1444e4a54019fcaddb68827cf1a4f7b9bb7e9250"} Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.740584 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerStarted","Data":"6fa767d4004b7071ab450a3ff465069b85b54dadf7bad861245cb4344509496d"} Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.740977 5116 kuberuntime_container.go:858] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-marketplace-wlccm" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" containerID="cri-o://27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" gracePeriod=2 Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.741053 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbgnq" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" containerID="cri-o://e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" gracePeriod=2 Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.770577 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-12-crc" podStartSLOduration=1.7705591410000001 podStartE2EDuration="1.770559141s" podCreationTimestamp="2026-03-22 00:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:11:36.76860125 +0000 UTC m=+167.790902643" watchObservedRunningTime="2026-03-22 00:11:36.770559141 +0000 UTC m=+167.792860514" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.174003 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.184017 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.275958 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276116 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276212 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276281 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276313 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276360 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.277314 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities" (OuterVolumeSpecName: "utilities") pod "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" (UID: "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.277371 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities" (OuterVolumeSpecName: "utilities") pod "29adb7c6-6fa5-4af7-9007-dc22cf4598e7" (UID: "29adb7c6-6fa5-4af7-9007-dc22cf4598e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.285128 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs" (OuterVolumeSpecName: "kube-api-access-rvgbs") pod "29adb7c6-6fa5-4af7-9007-dc22cf4598e7" (UID: "29adb7c6-6fa5-4af7-9007-dc22cf4598e7"). InnerVolumeSpecName "kube-api-access-rvgbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.285288 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt" (OuterVolumeSpecName: "kube-api-access-cksgt") pod "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" (UID: "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e"). InnerVolumeSpecName "kube-api-access-cksgt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.297946 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" (UID: "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.377941 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.377985 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.377999 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.378012 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.378023 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.410072 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29adb7c6-6fa5-4af7-9007-dc22cf4598e7" (UID: "29adb7c6-6fa5-4af7-9007-dc22cf4598e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.479350 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.749828 5116 generic.go:358] "Generic (PLEG): container finished" podID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" exitCode=0 Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.749938 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.750150 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"96be2eef25c44ecc457a8a1fa10ad35be205dda004792bd0cccdb43de654bdc4"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.749987 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.750189 5116 scope.go:117] "RemoveContainer" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.761430 5116 generic.go:358] "Generic (PLEG): container finished" podID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" exitCode=0 Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.762384 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.762538 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.762573 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"5618b991bbeac41e6785299c47466f86c4e18d2882dd3d7064face6121177c93"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.780752 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.781273 5116 scope.go:117] "RemoveContainer" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.788276 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.798669 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.799985 5116 scope.go:117] "RemoveContainer" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.803018 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.817650 5116 scope.go:117] "RemoveContainer" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.818152 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07\": container with ID starting with 27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07 not found: ID does not exist" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818211 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07"} err="failed to get container status \"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07\": rpc error: code = NotFound desc = could not find container \"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07\": container with ID starting with 27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07 not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818239 5116 scope.go:117] "RemoveContainer" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.818550 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a\": container with ID starting with 9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a not found: ID does not exist" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818568 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a"} err="failed to get container status \"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a\": rpc error: code = NotFound desc = could not find container \"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a\": container with ID starting with 9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818579 5116 scope.go:117] "RemoveContainer" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.818874 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5\": container with ID starting with 8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5 not found: ID does not exist" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818932 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5"} err="failed to get container status \"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5\": rpc error: code = NotFound desc = could not find container \"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5\": container 
with ID starting with 8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5 not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818967 5116 scope.go:117] "RemoveContainer" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.833992 5116 scope.go:117] "RemoveContainer" containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.849467 5116 scope.go:117] "RemoveContainer" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.884930 5116 scope.go:117] "RemoveContainer" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.886202 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5\": container with ID starting with e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5 not found: ID does not exist" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.886330 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5"} err="failed to get container status \"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5\": rpc error: code = NotFound desc = could not find container \"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5\": container with ID starting with e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5 not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.886423 5116 scope.go:117] "RemoveContainer" 
containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.888608 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e\": container with ID starting with b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e not found: ID does not exist" containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.888670 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e"} err="failed to get container status \"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e\": rpc error: code = NotFound desc = could not find container \"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e\": container with ID starting with b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.888710 5116 scope.go:117] "RemoveContainer" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.889080 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe\": container with ID starting with 95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe not found: ID does not exist" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.889108 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe"} err="failed to get container status \"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe\": rpc error: code = NotFound desc = could not find container \"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe\": container with ID starting with 95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe not found: ID does not exist" Mar 22 00:11:39 crc kubenswrapper[5116]: I0322 00:11:39.706469 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" path="/var/lib/kubelet/pods/29adb7c6-6fa5-4af7-9007-dc22cf4598e7/volumes" Mar 22 00:11:39 crc kubenswrapper[5116]: I0322 00:11:39.709396 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" path="/var/lib/kubelet/pods/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e/volumes" Mar 22 00:11:59 crc kubenswrapper[5116]: I0322 00:11:59.615122 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.134419 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.136949 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137006 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137069 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 
00:12:00.137087 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137125 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137142 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137224 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137243 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137260 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137276 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137299 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137318 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137558 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" Mar 
22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137585 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.156504 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.156657 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.162765 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.162858 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.162772 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.268898 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"auto-csr-approver-29568972-5s86m\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.370539 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"auto-csr-approver-29568972-5s86m\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " pod="openshift-infra/auto-csr-approver-29568972-5s86m" 
Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.408464 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"auto-csr-approver-29568972-5s86m\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.476005 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.851252 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.894100 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568972-5s86m" event={"ID":"907ec022-a4e4-4d33-8329-52c9bbb71520","Type":"ContainerStarted","Data":"de25deff2abb4d7c9ae4cfaa6d6a15c6e45609e7a4b4386bae474e620d2b3211"} Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.330727 5116 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h678z" Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.339132 5116 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h678z" Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.914641 5116 generic.go:358] "Generic (PLEG): container finished" podID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerID="5f050f299176f9b417ea910e3bb8affec9c6d4bf35a6de76d0aa5ed0d88ddf0f" exitCode=0 Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.914686 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568972-5s86m" 
event={"ID":"907ec022-a4e4-4d33-8329-52c9bbb71520","Type":"ContainerDied","Data":"5f050f299176f9b417ea910e3bb8affec9c6d4bf35a6de76d0aa5ed0d88ddf0f"} Mar 22 00:12:05 crc kubenswrapper[5116]: I0322 00:12:05.340299 5116 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2026-04-21 00:07:04 +0000 UTC" deadline="2026-04-16 04:03:48.043582248 +0000 UTC" Mar 22 00:12:05 crc kubenswrapper[5116]: I0322 00:12:05.340343 5116 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="603h51m42.703243438s" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.213087 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.291702 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"907ec022-a4e4-4d33-8329-52c9bbb71520\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.298522 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc" (OuterVolumeSpecName: "kube-api-access-jvfbc") pod "907ec022-a4e4-4d33-8329-52c9bbb71520" (UID: "907ec022-a4e4-4d33-8329-52c9bbb71520"). InnerVolumeSpecName "kube-api-access-jvfbc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.340660 5116 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2026-04-21 00:07:04 +0000 UTC" deadline="2026-04-16 02:01:17.200055935 +0000 UTC" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.341044 5116 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="601h49m10.859016373s" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.393285 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.928223 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568972-5s86m" event={"ID":"907ec022-a4e4-4d33-8329-52c9bbb71520","Type":"ContainerDied","Data":"de25deff2abb4d7c9ae4cfaa6d6a15c6e45609e7a4b4386bae474e620d2b3211"} Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.928283 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de25deff2abb4d7c9ae4cfaa6d6a15c6e45609e7a4b4386bae474e620d2b3211" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.928282 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.320885 5116 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.322895 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerName="oc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.322926 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerName="oc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.323159 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerName="oc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.327896 5116 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.327948 5116 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.328194 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329195 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" containerID="cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329248 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329370 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329191 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329210 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" gracePeriod=15 Mar 22 00:12:14 crc 
kubenswrapper[5116]: I0322 00:12:14.329609 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329649 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329679 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329696 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329776 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329809 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329833 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329848 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329869 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc 
kubenswrapper[5116]: I0322 00:12:14.329885 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329912 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329929 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329959 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329974 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329992 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330007 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330033 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330047 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330304 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" 
containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330333 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330351 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330376 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330397 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330416 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330433 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330701 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330724 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330985 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc 
kubenswrapper[5116]: I0322 00:12:14.331496 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.336850 5116 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3a14caf222afb62aaabdc47808b6f944" podUID="57755cc5f99000cc11e193051474d4e2" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.350431 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.369084 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406191 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406257 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406414 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406464 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406528 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406590 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406658 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406681 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406699 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406713 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.507954 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508010 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508041 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508078 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508153 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508155 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508199 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508155 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508237 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508288 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508282 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508329 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508351 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 
22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508355 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508449 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508501 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508559 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508590 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508771 5116 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508916 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.666370 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: E0322 00:12:14.693977 5116 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189f016a65c5b829 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,LastTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.016849 5116 generic.go:358] "Generic (PLEG): container finished" podID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerID="1d4ffcdcf7f3c1ceaea89d564eaa99a17f05170fe4b5407ce1655d672a0e5a4e" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.016953 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29568960-tjk88" event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerDied","Data":"1d4ffcdcf7f3c1ceaea89d564eaa99a17f05170fe4b5407ce1655d672a0e5a4e"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.018028 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.018328 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.019569 5116 generic.go:358] "Generic (PLEG): container finished" podID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerID="ee145f68231146608459069e1444e4a54019fcaddb68827cf1a4f7b9bb7e9250" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.019663 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" 
event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerDied","Data":"ee145f68231146608459069e1444e4a54019fcaddb68827cf1a4f7b9bb7e9250"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.020052 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.020277 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.020542 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.022802 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.024266 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025260 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="3a14caf222afb62aaabdc47808b6f944" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025290 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025299 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025307 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" exitCode=2 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025357 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027010 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027034 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"e89942e237048983deb4673cd81f4ae670d784cef4073c8536922b7c8b0579fa"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027600 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027997 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.028347 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.039290 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.359947 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.360951 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.361257 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.361510 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.365598 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.366065 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.366329 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.366704 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.434807 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.434900 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " Mar 
22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.434986 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.435098 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.435215 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.438439 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock" (OuterVolumeSpecName: "var-lock") pod "d04f6d8c-7814-4e1f-8000-afd2938eb5db" (UID: "d04f6d8c-7814-4e1f-8000-afd2938eb5db"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.438550 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d04f6d8c-7814-4e1f-8000-afd2938eb5db" (UID: "d04f6d8c-7814-4e1f-8000-afd2938eb5db"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.439248 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca" (OuterVolumeSpecName: "serviceca") pod "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" (UID: "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.442634 5116 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.442699 5116 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.442728 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.460440 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d04f6d8c-7814-4e1f-8000-afd2938eb5db" (UID: "d04f6d8c-7814-4e1f-8000-afd2938eb5db"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.462369 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd" (OuterVolumeSpecName: "kube-api-access-b9lrd") pod "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" (UID: "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58"). InnerVolumeSpecName "kube-api-access-b9lrd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.543502 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.543543 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.727293 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.727879 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.728424 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.728645 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.728920 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.729330 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847298 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") 
" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847405 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847470 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847521 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847559 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847619 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847738 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847950 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" (OuterVolumeSpecName: "ca-bundle-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "ca-bundle-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847600 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848133 5116 reconciler_common.go:299] "Volume detached for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848146 5116 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848155 5116 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848183 5116 reconciler_common.go:299] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.850634 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.949382 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.047822 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29568960-tjk88" event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerDied","Data":"82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c"} Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.047871 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.047988 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.051580 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerDied","Data":"6fa767d4004b7071ab450a3ff465069b85b54dadf7bad861245cb4344509496d"} Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.051643 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa767d4004b7071ab450a3ff465069b85b54dadf7bad861245cb4344509496d" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.051649 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.054141 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.055017 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" exitCode=0 Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.055118 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.055124 5116 scope.go:117] "RemoveContainer" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.064747 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.065144 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.065594 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.065869 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.068869 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.069762 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.070073 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.070391 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.070854 5116 scope.go:117] "RemoveContainer" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.072338 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.072661 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.073024 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.073311 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 
00:12:17.089354 5116 scope.go:117] "RemoveContainer" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.108961 5116 scope.go:117] "RemoveContainer" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.123533 5116 scope.go:117] "RemoveContainer" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.145086 5116 scope.go:117] "RemoveContainer" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.206928 5116 scope.go:117] "RemoveContainer" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.207831 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f\": container with ID starting with dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f not found: ID does not exist" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.207875 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f"} err="failed to get container status \"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f\": rpc error: code = NotFound desc = could not find container \"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f\": container with ID starting with dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.207901 5116 scope.go:117] 
"RemoveContainer" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.208243 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\": container with ID starting with b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456 not found: ID does not exist" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208266 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456"} err="failed to get container status \"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\": rpc error: code = NotFound desc = could not find container \"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\": container with ID starting with b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456 not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208283 5116 scope.go:117] "RemoveContainer" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.208791 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\": container with ID starting with 8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb not found: ID does not exist" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208838 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb"} err="failed to get container status \"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\": rpc error: code = NotFound desc = could not find container \"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\": container with ID starting with 8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208867 5116 scope.go:117] "RemoveContainer" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.209120 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\": container with ID starting with d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b not found: ID does not exist" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209160 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b"} err="failed to get container status \"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\": rpc error: code = NotFound desc = could not find container \"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\": container with ID starting with d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209208 5116 scope.go:117] "RemoveContainer" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.209494 5116 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\": container with ID starting with ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7 not found: ID does not exist" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209520 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7"} err="failed to get container status \"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\": rpc error: code = NotFound desc = could not find container \"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\": container with ID starting with ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7 not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209536 5116 scope.go:117] "RemoveContainer" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.209794 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\": container with ID starting with 15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5 not found: ID does not exist" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209874 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5"} err="failed to get container status \"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\": rpc error: code = NotFound desc = could not find container 
\"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\": container with ID starting with 15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5 not found: ID does not exist"
Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.707402 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a14caf222afb62aaabdc47808b6f944" path="/var/lib/kubelet/pods/3a14caf222afb62aaabdc47808b6f944/volumes"
Mar 22 00:12:19 crc kubenswrapper[5116]: I0322 00:12:19.703950 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:19 crc kubenswrapper[5116]: I0322 00:12:19.705813 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:19 crc kubenswrapper[5116]: I0322 00:12:19.706574 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:21 crc kubenswrapper[5116]: E0322 00:12:21.762520 5116 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189f016a65c5b829 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,LastTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:12:23 crc kubenswrapper[5116]: I0322 00:12:23.056780 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:12:23 crc kubenswrapper[5116]: I0322 00:12:23.056855 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.497823 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.498261 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.498750 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.499095 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.499475 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:23 crc kubenswrapper[5116]: I0322 00:12:23.499512 5116 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.499809 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms"
Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.700503 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms"
Mar 22 00:12:24 crc kubenswrapper[5116]: E0322 00:12:24.101534 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms"
Mar 22 00:12:24 crc kubenswrapper[5116]: I0322 00:12:24.644396 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" containerID="cri-o://0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5" gracePeriod=15
Mar 22 00:12:24 crc kubenswrapper[5116]: E0322 00:12:24.903477 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s"
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.107745 5116 generic.go:358] "Generic (PLEG): container finished" podID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerID="0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5" exitCode=0
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.107860 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerDied","Data":"0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5"}
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.108364 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerDied","Data":"541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb"}
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.108386 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb"
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.129397 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.129960 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.130563 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.130827 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.131105 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.256971 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257063 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257115 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257197 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257231 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257312 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257357 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257392 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257457 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257542 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257584 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257627 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257667 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257717 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") "
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.258361 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.258807 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.259075 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.259311 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.259609 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.264765 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.265202 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.265520 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.268514 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht" (OuterVolumeSpecName: "kube-api-access-qljht") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "kube-api-access-qljht". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.268596 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.268767 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.269183 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.269599 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.270901 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.359846 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.359969 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360032 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360052 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360070 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360089 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360106 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360129 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360158 5116 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360216 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360299 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360330 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360355 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360373 5116 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.116631 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.118042 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.118632 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.119115 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.119965 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.122015 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.122441 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.122771 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.123127 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:26 crc kubenswrapper[5116]: E0322 00:12:26.504789 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s"
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.133737 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.134077 5116 generic.go:358] "Generic (PLEG): container finished" podID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerID="05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54" exitCode=1
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.134233 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerDied","Data":"05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54"}
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.135967 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.136011 5116 scope.go:117] "RemoveContainer" containerID="05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54"
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.136538 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.137305 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.137845 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.138287 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.148727 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.148901 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"dad18ac6eecf9b080f9abca1e33533420e48da065dc1002c74302a487f52aacf"}
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.150260 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.150881 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.151420 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.151982 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.152739 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.705219 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.705501 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.706305 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: E0322 00:12:29.706501 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="6.4s"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.706758 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.707310 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.707936 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.708253 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.708595 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.709092 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.709630 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: E0322 00:12:29.709753 5116 desired_state_of_world_populator.go:305] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" volumeName="registry-storage"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.710122 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.730805 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9"
Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.730840 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc"
podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:29 crc kubenswrapper[5116]: E0322 00:12:29.731402 5116 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.732594 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.173753 5116 generic.go:358] "Generic (PLEG): container finished" podID="57755cc5f99000cc11e193051474d4e2" containerID="b32f7d41ffa88138d0582a1450f0b71884223fd0ff939e59c91365f086a779e1" exitCode=0 Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.173888 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerDied","Data":"b32f7d41ffa88138d0582a1450f0b71884223fd0ff939e59c91365f086a779e1"} Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.174131 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"d1bc653fcd32c7727edd873aec58d794e017d63a5b2b4ef85fd0d5342b71f652"} Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.174818 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.174855 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.175571 5116 status_manager.go:895] "Failed to get status for pod" 
podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.175582 5116 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.176031 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.176531 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.176901 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.177477 5116 status_manager.go:895] "Failed to get status for pod" 
podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.447038 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d349aa1ac1aeef5b43252e81842c9fdae910d60790027b9e872b5662ac4b78a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:fe773df04da144155e88592785e3f887dd2518380c36be5aa0d5f3c4b407346b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1738635128},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@
sha256:97ea275ea42f6b1f7f4041dea668f5ec2ba23ebc254008d276dc9589f8fa2899\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:a48320aa1bff5d245dea9a04c9c7f038669d74be18d254ab16f1b1071c6efe02\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1278267500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:743309461ecf5761903ae65e667afffe78054c5286dd469edd6aebdbc1266545\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8334109ecba91dfe3e738d0b6bb46e4263ca25d0f5a99832a61371f9ea33f2fa\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1239891053},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b25b4ab3e224e729bcb897a9d8b4500cb8cdf41dc4e39241fca36503dd7a6e6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7010a1d34012ae242b0950c830b00b3a9907b1dc17951db92c5e0d4a06d6d3a1\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1183656546},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":58
4721741},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisi
oner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.447577 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.447891 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.448212 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc 
kubenswrapper[5116]: E0322 00:12:30.448585 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.448616 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183090 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"c051c9ab63505541bf23433f5ec8f77a814dab2ddf493328097b48d9e97dcdf0"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183142 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"130990b4dc827738704ae59ea639bd7b0fa3f03691dcc3c370a1fa1d424322f9"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183159 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"80c741d698dd97e3f8f7308d04234de45eac4bdc3ba94d035f805eac0180fc33"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183185 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"212ceaec8f8d88c0485faa66cfe2cba85614269d326bff6250fec302122a2e21"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.704309 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.709821 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.788972 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191298 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"44ab5aef330a7ac5d594e79eb92f23b9cb2e8c72a76793adbf837829a8ec347e"} Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191482 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191638 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191670 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:34 crc kubenswrapper[5116]: I0322 00:12:34.732757 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:34 crc kubenswrapper[5116]: I0322 00:12:34.733058 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:34 crc kubenswrapper[5116]: I0322 00:12:34.738365 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:37 crc kubenswrapper[5116]: I0322 00:12:37.413667 5116 kubelet.go:3329] "Deleted mirror pod as it didn't match the static Pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:37 crc kubenswrapper[5116]: I0322 00:12:37.414110 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:38 crc kubenswrapper[5116]: I0322 00:12:38.232548 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:38 crc kubenswrapper[5116]: I0322 00:12:38.232581 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:38 crc kubenswrapper[5116]: I0322 00:12:38.237858 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:39 crc kubenswrapper[5116]: I0322 00:12:39.238240 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:39 crc kubenswrapper[5116]: I0322 00:12:39.238269 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:39 crc kubenswrapper[5116]: I0322 00:12:39.743614 5116 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="57755cc5f99000cc11e193051474d4e2" podUID="896372db-5d6a-4ee9-a4a2-ad50e6a194a1" Mar 22 00:12:43 crc kubenswrapper[5116]: I0322 00:12:43.201868 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:47 crc kubenswrapper[5116]: I0322 00:12:47.989342 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.132398 
5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.254907 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.665746 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.667634 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.071720 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.108780 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.150327 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.188364 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.357874 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.443039 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\"" Mar 22 
00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.446331 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.516077 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.634186 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.766598 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.767099 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.845023 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.941232 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.083119 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.205069 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.250674 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.475468 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.980316 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.043510 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.045877 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.220865 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.380631 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.452325 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.493531 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.587103 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.607468 5116 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.697203 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.697658 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.753621 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.758261 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.790740 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.811623 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.820130 5116 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.836046 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.836011553 podStartE2EDuration="37.836011553s" podCreationTimestamp="2026-03-22 00:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:12:37.561398404 +0000 UTC m=+228.583699787" watchObservedRunningTime="2026-03-22 00:12:51.836011553 +0000 UTC m=+242.858312956" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.840295 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-66458b6674-8qfhd"] Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.840662 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-6b75ff674b-bdglf"] Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841254 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841297 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841785 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerName="image-pruner" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841877 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerName="image-pruner" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841936 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841996 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842101 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerName="installer" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842189 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerName="installer" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842417 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerName="image-pruner" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842553 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerName="installer" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842634 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.851778 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.852279 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.854623 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.854842 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855068 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855255 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855498 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855639 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855808 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855959 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.856762 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 
00:12:51.856891 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.857035 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.857455 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.862649 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.873570 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.876472 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.876449396 podStartE2EDuration="14.876449396s" podCreationTimestamp="2026-03-22 00:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:12:51.872979332 +0000 UTC m=+242.895280715" watchObservedRunningTime="2026-03-22 00:12:51.876449396 +0000 UTC m=+242.898750769" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.876842 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.878920 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Mar 22 00:12:51 crc 
kubenswrapper[5116]: I0322 00:12:51.910103 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910160 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910201 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910371 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-dir\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910396 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-policies\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910419 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-session\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910444 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910485 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910509 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: 
\"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910534 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz55q\" (UniqueName: \"kubernetes.io/projected/58f868be-d7d6-4e45-96b8-49fb29023df0-kube-api-access-vz55q\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910557 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-login\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910582 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: 
I0322 00:12:51.910619 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-error\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.996919 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011515 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011586 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz55q\" (UniqueName: \"kubernetes.io/projected/58f868be-d7d6-4e45-96b8-49fb29023df0-kube-api-access-vz55q\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011622 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-login\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc 
kubenswrapper[5116]: I0322 00:12:52.011647 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011665 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011697 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-error\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011718 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011745 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011760 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011790 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-dir\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011808 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-policies\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011829 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-session\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: 
I0322 00:12:52.011852 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011890 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.012370 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-dir\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.013415 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-policies\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.013994 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: 
\"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.014227 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.014402 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.018787 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-error\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.018839 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-session\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.018849 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.019299 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.019555 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.020492 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-login\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.023060 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " 
pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.023084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.039790 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz55q\" (UniqueName: \"kubernetes.io/projected/58f868be-d7d6-4e45-96b8-49fb29023df0-kube-api-access-vz55q\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.096759 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.137842 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.172114 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.178178 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.233004 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.254046 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.273158 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.373486 5116 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.396297 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.502183 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.517266 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.573501 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.599857 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.622731 5116 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:12:52 crc 
kubenswrapper[5116]: I0322 00:12:52.675710 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.711080 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-client\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.744398 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.748147 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.816809 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.877326 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.877430 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.891439 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.893489 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.924521 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.933260 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.959032 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.006630 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.057274 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.057353 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.082970 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.135797 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.302388 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\"" Mar 22 
00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.374549 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.379898 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.438986 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.483386 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.653479 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.686184 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.708102 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" path="/var/lib/kubelet/pods/73ebea9b-fc7b-4d54-af53-f6f61e0fce97/volumes" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.708826 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.724995 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.760751 5116 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.767622 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.804285 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.890534 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.890542 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.891906 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.005635 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.011313 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.013667 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.123308 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.166144 5116 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.201767 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.212581 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.220454 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.256230 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.288002 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.378693 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.380493 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.453064 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.528969 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.569092 5116 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.773768 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.869329 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.875740 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.998017 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\"" Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.998742 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.038552 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.045644 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.053071 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.171887 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 
00:12:55.186149 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.203267 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.236671 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.295552 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.322947 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.389632 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.441592 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.495711 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.541957 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.588795 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.736574 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.776508 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.781645 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.834106 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.901969 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.929837 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.948255 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\"" Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.954483 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.140163 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.267448 5116 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.379356 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.392838 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.394905 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.441672 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.494822 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.543021 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.556758 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.560057 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.598087 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.599878 5116 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.614908 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.659534 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.686847 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.850230 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.905355 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.930885 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.945996 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\"" Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.956285 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.001593 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.008736 5116 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.039128 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.113699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.137696 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.180043 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.181779 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.184343 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.185621 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.321772 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.451196 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.484139 5116 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.593213 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.639027 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.658855 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.778654 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.827376 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.834845 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.856473 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.972300 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.118822 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 
22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.228729 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.393290 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.482672 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.483948 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.578911 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.582036 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.786830 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.863162 5116 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.863760 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" containerID="cri-o://5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5" gracePeriod=5 Mar 22 
00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.878454 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.886923 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\"" Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.938656 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.032865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.169519 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.213792 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.241643 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.266094 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.324848 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.392523 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.445398 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.557819 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.565016 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.600294 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.615548 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.618179 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.624727 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.679240 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.714128 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.943342 5116 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\"" Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.950235 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.007896 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.031589 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.117892 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.510012 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.536131 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.655582 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.715780 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.757567 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.771047 5116 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.787656 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.827577 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.839633 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b75ff674b-bdglf"] Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.856517 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\"" Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.975255 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.021963 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.029990 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.124476 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b75ff674b-bdglf"] Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.201310 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.201935 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.347861 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.358625 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" event={"ID":"58f868be-d7d6-4e45-96b8-49fb29023df0","Type":"ContainerStarted","Data":"8fe53e940e9eae97000069792a4b484816e6c60558cfba780da749fcb2830fe5"} Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.384015 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.417746 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.425088 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.463055 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.479028 5116 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.539210 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.570448 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.852754 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.927275 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.998692 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.032135 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.032302 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.071505 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.248911 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.274631 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.290908 
5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.365380 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" event={"ID":"58f868be-d7d6-4e45-96b8-49fb29023df0","Type":"ContainerStarted","Data":"c4c2eb53d08e3ad6c104a996b921878333870c0fc1c07a0372d96ac12b016924"} Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.365753 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.368444 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.373669 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.388691 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" podStartSLOduration=63.388673777 podStartE2EDuration="1m3.388673777s" podCreationTimestamp="2026-03-22 00:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:02.387518658 +0000 UTC m=+253.409820021" watchObservedRunningTime="2026-03-22 00:13:02.388673777 +0000 UTC m=+253.410975150" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.587612 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\"" Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.732341 5116 
reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.377243 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.377457 5116 generic.go:358] "Generic (PLEG): container finished" podID="f7dbc7e1ee9c187a863ef9b473fad27b" containerID="5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5" exitCode=137 Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.432747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.432857 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569278 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569406 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569401 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir" (OuterVolumeSpecName: "resource-dir") pod 
"f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569451 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569490 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569596 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569488 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests" (OuterVolumeSpecName: "manifests") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569522 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock" (OuterVolumeSpecName: "var-lock") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569731 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log" (OuterVolumeSpecName: "var-log") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570529 5116 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570579 5116 reconciler_common.go:299] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570609 5116 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570631 5116 reconciler_common.go:299] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.582841 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.672034 5116 reconciler_common.go:299] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.384655 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.384992 5116 scope.go:117] "RemoveContainer" containerID="5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.385201 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.704972 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" path="/var/lib/kubelet/pods/f7dbc7e1ee9c187a863ef9b473fad27b/volumes" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.705514 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.716663 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.716710 5116 kubelet.go:2759] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="76689ad2-1282-4ff1-bc50-5fa527bbd4d2" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.720011 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.720049 5116 kubelet.go:2784] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="76689ad2-1282-4ff1-bc50-5fa527bbd4d2" Mar 22 00:13:13 crc kubenswrapper[5116]: I0322 00:13:13.914579 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\"" Mar 22 00:13:16 crc kubenswrapper[5116]: I0322 00:13:16.304975 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.621909 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"] Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.623020 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" containerID="cri-o://f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde" gracePeriod=30 Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.640359 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"] Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.640894 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" containerID="cri-o://9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235" gracePeriod=30 Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.011839 5116 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.027616 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.040568 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041160 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041206 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041237 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041245 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041271 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041279 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041375 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" 
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041395 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041406 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.044856 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.055322 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.065723 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.074844 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.080814 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144608 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144666 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144756 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvntc\" (UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144819 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144847 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" 
(UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144882 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144931 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144951 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144995 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145030 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145054 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145090 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp" (OuterVolumeSpecName: "tmp") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145587 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp" (OuterVolumeSpecName: "tmp") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145627 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145721 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca" (OuterVolumeSpecName: "client-ca") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145726 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145852 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145946 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145989 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146033 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146227 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146242 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config" (OuterVolumeSpecName: "config") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146354 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146422 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146446 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146570 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146591 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146603 5116 
reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146615 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146766 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca" (OuterVolumeSpecName: "client-ca") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146814 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146881 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config" (OuterVolumeSpecName: "config") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.151841 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc" (OuterVolumeSpecName: "kube-api-access-wvntc") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "kube-api-access-wvntc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.152001 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728" (OuterVolumeSpecName: "kube-api-access-qp728") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "kube-api-access-qp728". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.152082 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.152230 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.247945 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248026 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248077 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248117 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248370 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248463 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248526 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248576 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248604 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.249032 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251293 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.249034 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.250341 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251309 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.249388 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251466 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251518 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251536 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251552 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvntc\" (UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251594 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251609 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251622 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.252111 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.252303 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.252407 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.254078 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.254212 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.266054 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.267355 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.378854 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.394912 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.472916 5116 generic.go:358] "Generic (PLEG): container finished" podID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235" exitCode=0
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.473078 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerDied","Data":"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"}
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.473114 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerDied","Data":"b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56"}
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.473140 5116 scope.go:117] "RemoveContainer" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.475512 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.486125 5116 generic.go:358] "Generic (PLEG): container finished" podID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerID="a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc" exitCode=0
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.486217 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerDied","Data":"a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc"}
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.487792 5116 generic.go:358] "Generic (PLEG): container finished" podID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde" exitCode=0
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.487936 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerDied","Data":"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"}
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.488061 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerDied","Data":"4a8a88fe9fa050abb0479c637d1e4e232ab389aa2a939c4d9f3135fe99408731"}
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.488273 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.532659 5116 scope.go:117] "RemoveContainer" containerID="a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.570618 5116 scope.go:117] "RemoveContainer" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"
Mar 22 00:13:18 crc kubenswrapper[5116]: E0322 00:13:18.571180 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235\": container with ID starting with 9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235 not found: ID does not exist" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.571233 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"} err="failed to get container status \"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235\": rpc error: code = NotFound desc = could not find container \"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235\": container with ID starting with 9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235 not found: ID does not exist"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.571258 5116 scope.go:117] "RemoveContainer" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.604648 5116 scope.go:117] "RemoveContainer" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"
Mar 22 00:13:18 crc kubenswrapper[5116]: E0322 00:13:18.607045 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde\": container with ID starting with f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde not found: ID does not exist" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.607098 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"} err="failed to get container status \"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde\": rpc error: code = NotFound desc = could not find container \"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde\": container with ID starting with f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde not found: ID does not exist"
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.628543 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"]
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.633067 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"]
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.644199 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"]
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.650160 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"]
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.693828 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"]
Mar 22 00:13:18 crc kubenswrapper[5116]: W0322 00:13:18.700423 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd576ea5f_6d26_4c36_8d3d_1efeed9d5691.slice/crio-3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af WatchSource:0}: Error finding container 3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af: Status 404 returned error can't find the container with id 3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.750108 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"]
Mar 22 00:13:18 crc kubenswrapper[5116]: W0322 00:13:18.751927 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51169795_1332_4ee1_94c0_c2f58d62de92.slice/crio-89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39 WatchSource:0}: Error finding container 89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39: Status 404 returned error can't find the container with id 89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.494265 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerStarted","Data":"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"}
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.494729 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerStarted","Data":"89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39"}
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.494770 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.496883 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerStarted","Data":"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"}
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.496922 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerStarted","Data":"3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af"}
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.497138 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.500889 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.501328 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerStarted","Data":"a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923"}
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.501761 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.507827 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.507974 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.543260 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" podStartSLOduration=2.543240684 podStartE2EDuration="2.543240684s" podCreationTimestamp="2026-03-22 00:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:19.518985448 +0000 UTC m=+270.541286921" watchObservedRunningTime="2026-03-22 00:13:19.543240684 +0000 UTC m=+270.565542067"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.552907 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" podStartSLOduration=2.5528916649999998 podStartE2EDuration="2.552891665s" podCreationTimestamp="2026-03-22 00:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:19.551312509 +0000 UTC m=+270.573613882" watchObservedRunningTime="2026-03-22 00:13:19.552891665 +0000 UTC m=+270.575193038"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.703499 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" path="/var/lib/kubelet/pods/5f51f3b4-6887-42b5-ad77-5a2f349a162a/volumes"
Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.704251 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" path="/var/lib/kubelet/pods/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513/volumes"
Mar 22 00:13:21 crc kubenswrapper[5116]: I0322 00:13:21.331974 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"]
Mar 22 00:13:21 crc kubenswrapper[5116]: I0322 00:13:21.341950 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"]
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.522885 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager" containerID="cri-o://556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3" gracePeriod=30
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.523516 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager" containerID="cri-o://7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad" gracePeriod=30
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.809126 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\""
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.889274 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.911774 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918009 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c55489586-7pf6k"]
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918546 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918566 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918588 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918595 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918704 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918717 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.925435 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.959313 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c55489586-7pf6k"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.009053 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.013387 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.013528 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.015970 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016030 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016080 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016106 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016188 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016233 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016258 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016337 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016365 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016421 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016485 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") "
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016687 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-config\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016737 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25s54\" (UniqueName: \"kubernetes.io/projected/086874e0-a0bb-4e3d-b08a-ff841931a631-kube-api-access-25s54\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016768 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/086874e0-a0bb-4e3d-b08a-ff841931a631-serving-cert\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016792 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-proxy-ca-bundles\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/086874e0-a0bb-4e3d-b08a-ff841931a631-tmp\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016852 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-client-ca\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.019589 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca" (OuterVolumeSpecName: "client-ca") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.019828 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca" (OuterVolumeSpecName: "client-ca") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020097 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020244 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp" (OuterVolumeSpecName: "tmp") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020580 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config" (OuterVolumeSpecName: "config") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020706 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp" (OuterVolumeSpecName: "tmp") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020765 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config" (OuterVolumeSpecName: "config") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.023549 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp" (OuterVolumeSpecName: "kube-api-access-mvplp") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "kube-api-access-mvplp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.024216 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.024302 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.025750 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d" (OuterVolumeSpecName: "kube-api-access-vd44d") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "kube-api-access-vd44d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.057432 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.057499 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.057547 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.058112 5116 kuberuntime_manager.go:1107]
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.058197 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638" gracePeriod=600 Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117550 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117600 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117649 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: 
\"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117694 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-config\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117791 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25s54\" (UniqueName: \"kubernetes.io/projected/086874e0-a0bb-4e3d-b08a-ff841931a631-kube-api-access-25s54\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117811 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117840 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/086874e0-a0bb-4e3d-b08a-ff841931a631-serving-cert\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117858 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-proxy-ca-bundles\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117877 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/086874e0-a0bb-4e3d-b08a-ff841931a631-tmp\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117897 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-client-ca\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117944 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117955 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117963 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117972 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117980 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117987 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117995 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118003 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118011 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118019 5116 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118027 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118900 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/086874e0-a0bb-4e3d-b08a-ff841931a631-tmp\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.119305 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-config\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.119443 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-client-ca\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.119550 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-proxy-ca-bundles\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " 
pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.121586 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/086874e0-a0bb-4e3d-b08a-ff841931a631-serving-cert\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.134361 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25s54\" (UniqueName: \"kubernetes.io/projected/086874e0-a0bb-4e3d-b08a-ff841931a631-kube-api-access-25s54\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.218916 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.220576 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.220481 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.222264 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.222319 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.223098 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.223237 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: 
I0322 00:13:23.223602 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.227533 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.240757 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.280484 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.333906 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.485621 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c55489586-7pf6k"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.530753 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" event={"ID":"086874e0-a0bb-4e3d-b08a-ff841931a631","Type":"ContainerStarted","Data":"dc029588657b553f65a348cc7284f02ac2055add166776a383e724f1320a617e"} Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.533747 5116 generic.go:358] "Generic (PLEG): container finished" podID="51169795-1332-4ee1-94c0-c2f58d62de92" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad" exitCode=0 Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.533934 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.536337 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerDied","Data":"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"} Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.536372 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerDied","Data":"89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39"} Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.536402 5116 scope.go:117] "RemoveContainer" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.552261 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638" exitCode=0 Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.552389 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"} Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.552420 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e"} Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557314 5116 generic.go:358] "Generic (PLEG): container 
finished" podID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3" exitCode=0 Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557494 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerDied","Data":"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"} Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557539 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerDied","Data":"3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af"} Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557628 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.573386 5116 scope.go:117] "RemoveContainer" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad" Mar 22 00:13:23 crc kubenswrapper[5116]: E0322 00:13:23.574041 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad\": container with ID starting with 7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad not found: ID does not exist" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.574085 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"} err="failed to get container status 
\"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad\": rpc error: code = NotFound desc = could not find container \"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad\": container with ID starting with 7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad not found: ID does not exist" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.574108 5116 scope.go:117] "RemoveContainer" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.586806 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.590072 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.593008 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:23 crc kubenswrapper[5116]: W0322 00:13:23.600674 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a2bcaa_7073_4b67_bd66_80d71ec35171.slice/crio-1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da WatchSource:0}: Error finding container 1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da: Status 404 returned error can't find the container with id 1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.617081 5116 scope.go:117] "RemoveContainer" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3" Mar 22 00:13:23 crc kubenswrapper[5116]: E0322 00:13:23.618787 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3\": container with ID starting with 556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3 not found: ID does not exist" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.618822 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"} err="failed to get container status \"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3\": rpc error: code = NotFound desc = could not find container \"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3\": container with ID starting with 556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3 not found: ID does not exist" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.624361 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.629525 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.703250 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" path="/var/lib/kubelet/pods/51169795-1332-4ee1-94c0-c2f58d62de92/volumes" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.706566 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" path="/var/lib/kubelet/pods/d576ea5f-6d26-4c36-8d3d-1efeed9d5691/volumes" Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.566607 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" 
event={"ID":"086874e0-a0bb-4e3d-b08a-ff841931a631","Type":"ContainerStarted","Data":"daddf578ea7897b2fcc71cca1907b0429901ca877ff2e97584b0ce47cae89286"} Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.567736 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.569535 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerStarted","Data":"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15"} Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.569561 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerStarted","Data":"1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da"} Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.570316 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.580343 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.589322 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" podStartSLOduration=3.589308178 podStartE2EDuration="3.589308178s" podCreationTimestamp="2026-03-22 00:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:24.587316167 +0000 UTC 
m=+275.609617580" watchObservedRunningTime="2026-03-22 00:13:24.589308178 +0000 UTC m=+275.611609551" Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.625835 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" podStartSLOduration=2.6258122459999997 podStartE2EDuration="2.625812246s" podCreationTimestamp="2026-03-22 00:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:24.624543811 +0000 UTC m=+275.646845194" watchObservedRunningTime="2026-03-22 00:13:24.625812246 +0000 UTC m=+275.648113629" Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.650393 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:26 crc kubenswrapper[5116]: I0322 00:13:26.535052 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\"" Mar 22 00:13:30 crc kubenswrapper[5116]: I0322 00:13:30.867776 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\"" Mar 22 00:13:31 crc kubenswrapper[5116]: I0322 00:13:31.117461 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:13:36 crc kubenswrapper[5116]: I0322 00:13:36.261451 5116 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 22 00:13:37 crc kubenswrapper[5116]: I0322 00:13:37.643342 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:37 crc kubenswrapper[5116]: 
I0322 00:13:37.643618 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager" containerID="cri-o://ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" gracePeriod=30 Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.081457 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.106181 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79"] Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.106953 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.106976 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.107122 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.114868 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.122476 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79"] Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123046 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123148 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123277 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123347 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123381 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: 
\"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123557 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-config\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123629 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-client-ca\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123696 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mx7\" (UniqueName: \"kubernetes.io/projected/e0cb0732-e531-42fd-a042-7d691a4292ed-kube-api-access-h2mx7\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123736 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0cb0732-e531-42fd-a042-7d691a4292ed-tmp\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123765 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0cb0732-e531-42fd-a042-7d691a4292ed-serving-cert\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.128580 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp" (OuterVolumeSpecName: "tmp") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.129200 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca" (OuterVolumeSpecName: "client-ca") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.129210 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config" (OuterVolumeSpecName: "config") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.139074 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.139078 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz" (OuterVolumeSpecName: "kube-api-access-w2shz") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "kube-api-access-w2shz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.224920 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-config\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.224985 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-client-ca\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225032 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mx7\" (UniqueName: \"kubernetes.io/projected/e0cb0732-e531-42fd-a042-7d691a4292ed-kube-api-access-h2mx7\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225057 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/e0cb0732-e531-42fd-a042-7d691a4292ed-tmp\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225080 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0cb0732-e531-42fd-a042-7d691a4292ed-serving-cert\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225123 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225137 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225146 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225262 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225489 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0cb0732-e531-42fd-a042-7d691a4292ed-tmp\") pod 
\"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225272 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225871 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-client-ca\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.226689 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-config\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.230143 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0cb0732-e531-42fd-a042-7d691a4292ed-serving-cert\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.240228 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mx7\" (UniqueName: \"kubernetes.io/projected/e0cb0732-e531-42fd-a042-7d691a4292ed-kube-api-access-h2mx7\") pod 
\"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.472194 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661243 5116 generic.go:358] "Generic (PLEG): container finished" podID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" exitCode=0 Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661353 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerDied","Data":"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15"} Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661690 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerDied","Data":"1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da"} Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661719 5116 scope.go:117] "RemoveContainer" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661432 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.690187 5116 scope.go:117] "RemoveContainer" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" Mar 22 00:13:38 crc kubenswrapper[5116]: E0322 00:13:38.690720 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15\": container with ID starting with ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15 not found: ID does not exist" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.690756 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15"} err="failed to get container status \"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15\": rpc error: code = NotFound desc = could not find container \"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15\": container with ID starting with ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15 not found: ID does not exist" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.697914 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.703356 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.961935 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79"] Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 
00:13:39.671104 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" event={"ID":"e0cb0732-e531-42fd-a042-7d691a4292ed","Type":"ContainerStarted","Data":"20f72b984295860c2949f5663643d60c6697022c401296c0a5b1e5299f237b02"} Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 00:13:39.671430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" event={"ID":"e0cb0732-e531-42fd-a042-7d691a4292ed","Type":"ContainerStarted","Data":"2d8fcd89b7d6ed557a003f7743e9c81d54cb4051f50142ca977de2db85b69ed3"} Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 00:13:39.692262 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" podStartSLOduration=2.692230781 podStartE2EDuration="2.692230781s" podCreationTimestamp="2026-03-22 00:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:39.68539663 +0000 UTC m=+290.707698003" watchObservedRunningTime="2026-03-22 00:13:39.692230781 +0000 UTC m=+290.714532204" Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 00:13:39.717433 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" path="/var/lib/kubelet/pods/07a2bcaa-7073-4b67-bd66-80d71ec35171/volumes" Mar 22 00:13:40 crc kubenswrapper[5116]: I0322 00:13:40.676957 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:40 crc kubenswrapper[5116]: I0322 00:13:40.683766 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:40 crc kubenswrapper[5116]: 
I0322 00:13:40.700854 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 22 00:13:49 crc kubenswrapper[5116]: I0322 00:13:49.846794 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:13:49 crc kubenswrapper[5116]: I0322 00:13:49.847873 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.287548 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.288376 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4x6l" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server" containerID="cri-o://fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.303452 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.304031 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zrcmf" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server" containerID="cri-o://58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.312817 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.313111 5116 
kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" containerID="cri-o://a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.335142 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.335542 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kp7rb" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server" containerID="cri-o://c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.341100 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.341547 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wss9d" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server" containerID="cri-o://a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.349027 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.356697 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.356809 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.515978 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.516034 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-tmp\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.516075 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.516094 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdmx\" (UniqueName: \"kubernetes.io/projected/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-kube-api-access-2mdmx\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617616 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-tmp\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617691 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617717 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdmx\" (UniqueName: \"kubernetes.io/projected/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-kube-api-access-2mdmx\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617793 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.618362 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-tmp\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " 
pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.618974 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.634298 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdmx\" (UniqueName: \"kubernetes.io/projected/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-kube-api-access-2mdmx\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.635350 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.722547 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.761417 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" cmd=["grpc_health_probe","-addr=:50051"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.761563 5116 generic.go:358] "Generic (PLEG): container finished" podID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" exitCode=0 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.761594 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985"} Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.770589 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" cmd=["grpc_health_probe","-addr=:50051"] Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.772024 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" 
cmd=["grpc_health_probe","-addr=:50051"] Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.772089 5116 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-kp7rb" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server" probeResult="unknown" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.780189 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.787027 5116 generic.go:358] "Generic (PLEG): container finished" podID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerID="a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9" exitCode=0 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.787125 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9"} Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.804604 5116 generic.go:358] "Generic (PLEG): container finished" podID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerID="a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923" exitCode=0 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.804948 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerDied","Data":"a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923"} Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.804990 5116 scope.go:117] "RemoveContainer" 
containerID="a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.835287 5116 generic.go:358] "Generic (PLEG): container finished" podID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerID="58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727" exitCode=0 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.835362 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727"} Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.843457 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.842404 5116 generic.go:358] "Generic (PLEG): container finished" podID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerID="fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722" exitCode=0 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.845556 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerDied","Data":"fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722"} Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.876387 5116 scope.go:117] "RemoveContainer" containerID="fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.906919 5116 scope.go:117] "RemoveContainer" containerID="bd585f6a2418bf617978e44c5ded778fb5ab883949c7c6d99346b0cce7aab8d6" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.907406 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.924127 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.924367 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.924408 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.926148 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities" (OuterVolumeSpecName: "utilities") pod "da3b0eb3-e48f-4080-bfdc-522f18cf2876" (UID: "da3b0eb3-e48f-4080-bfdc-522f18cf2876"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.932181 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2" (OuterVolumeSpecName: "kube-api-access-zh2v2") pod "da3b0eb3-e48f-4080-bfdc-522f18cf2876" (UID: "da3b0eb3-e48f-4080-bfdc-522f18cf2876"). InnerVolumeSpecName "kube-api-access-zh2v2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.933151 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.935756 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.940759 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.941579 5116 scope.go:117] "RemoveContainer" containerID="48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.981194 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da3b0eb3-e48f-4080-bfdc-522f18cf2876" (UID: "da3b0eb3-e48f-4080-bfdc-522f18cf2876"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.025940 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026023 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026058 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026089 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026120 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026185 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"77380b82-4c44-4cfd-a7b1-e77b060af507\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026262 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"77380b82-4c44-4cfd-a7b1-e77b060af507\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026289 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"696eed68-bf2d-4bbd-865f-07998d61f8ab\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026311 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"696eed68-bf2d-4bbd-865f-07998d61f8ab\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026345 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"77380b82-4c44-4cfd-a7b1-e77b060af507\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026375 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " Mar 22 00:13:53 
crc kubenswrapper[5116]: I0322 00:13:53.026416 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"696eed68-bf2d-4bbd-865f-07998d61f8ab\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026431 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp" (OuterVolumeSpecName: "tmp") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026477 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026674 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026686 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026695 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026704 5116 
reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026740 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.027469 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities" (OuterVolumeSpecName: "utilities") pod "fe41a890-8a59-4fc7-b392-b7bab2ad5832" (UID: "fe41a890-8a59-4fc7-b392-b7bab2ad5832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.027961 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities" (OuterVolumeSpecName: "utilities") pod "696eed68-bf2d-4bbd-865f-07998d61f8ab" (UID: "696eed68-bf2d-4bbd-865f-07998d61f8ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.028736 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities" (OuterVolumeSpecName: "utilities") pod "77380b82-4c44-4cfd-a7b1-e77b060af507" (UID: "77380b82-4c44-4cfd-a7b1-e77b060af507"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.028902 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl" (OuterVolumeSpecName: "kube-api-access-x8ktl") pod "fe41a890-8a59-4fc7-b392-b7bab2ad5832" (UID: "fe41a890-8a59-4fc7-b392-b7bab2ad5832"). InnerVolumeSpecName "kube-api-access-x8ktl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.030178 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn" (OuterVolumeSpecName: "kube-api-access-6cpmn") pod "77380b82-4c44-4cfd-a7b1-e77b060af507" (UID: "77380b82-4c44-4cfd-a7b1-e77b060af507"). InnerVolumeSpecName "kube-api-access-6cpmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.030550 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5" (OuterVolumeSpecName: "kube-api-access-vwwx5") pod "696eed68-bf2d-4bbd-865f-07998d61f8ab" (UID: "696eed68-bf2d-4bbd-865f-07998d61f8ab"). InnerVolumeSpecName "kube-api-access-vwwx5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.030734 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9" (OuterVolumeSpecName: "kube-api-access-crwp9") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "kube-api-access-crwp9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.032979 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.052677 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "696eed68-bf2d-4bbd-865f-07998d61f8ab" (UID: "696eed68-bf2d-4bbd-865f-07998d61f8ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.078971 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77380b82-4c44-4cfd-a7b1-e77b060af507" (UID: "77380b82-4c44-4cfd-a7b1-e77b060af507"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127729 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127772 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127790 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127810 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127829 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127846 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127863 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127881 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127897 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127912 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127929 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.130160 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe41a890-8a59-4fc7-b392-b7bab2ad5832" (UID: "fe41a890-8a59-4fc7-b392-b7bab2ad5832"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.170422 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"] Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.181013 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:13:53 crc kubenswrapper[5116]: W0322 00:13:53.184269 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba8a6a03_e32a_4121_86e1_d856ddf7a73b.slice/crio-1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95 WatchSource:0}: Error finding container 1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95: Status 404 returned error can't find the container with id 1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95 Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.186032 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.187557 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.229039 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.704128 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" path="/var/lib/kubelet/pods/da3b0eb3-e48f-4080-bfdc-522f18cf2876/volumes" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.869409 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" 
event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"82447639f8664a7a9be68c50329975aae58c8919cedaf2554c6f5ebb2a14ac22"} Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.869482 5116 scope.go:117] "RemoveContainer" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.869532 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.875086 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"4076d8a97891e463c22fe9847edcf67c692b44a8dbcd9aa75ba00b5c2c7fdc81"} Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.876355 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerDied","Data":"082b4604427fbc7c5d9bce23172c03602291dda6ccea1696f1f624d1746d3739"} Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.876475 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.878499 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.878727 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"c8dbd89a41371e9d08f38390365ebbf2b2a5481a8e6093a86e6911bc41519ed3"} Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.878788 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.880645 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" event={"ID":"ba8a6a03-e32a-4121-86e1-d856ddf7a73b","Type":"ContainerStarted","Data":"7980189e946d1931331cd080fb56ff14fd80761a7e28038936449a6a3b51ce3c"} Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.880665 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" event={"ID":"ba8a6a03-e32a-4121-86e1-d856ddf7a73b","Type":"ContainerStarted","Data":"1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95"} Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.880998 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.883972 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.894293 5116 scope.go:117] "RemoveContainer" containerID="30c022eef87348aae4e9bdbc424e5f6c1baa0356ea65b1994f9191806ffd90dd" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.915843 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" podStartSLOduration=1.915819884 podStartE2EDuration="1.915819884s" podCreationTimestamp="2026-03-22 00:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:53.907683387 +0000 UTC m=+304.929984760" watchObservedRunningTime="2026-03-22 00:13:53.915819884 +0000 UTC m=+304.938121257" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.963980 5116 scope.go:117] "RemoveContainer" containerID="60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.970900 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.988865 5116 scope.go:117] "RemoveContainer" containerID="a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9" Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.995255 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.007909 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.010436 5116 scope.go:117] "RemoveContainer" containerID="4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.010496 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.014100 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.017383 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.021310 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.023022 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.030331 5116 scope.go:117] "RemoveContainer" containerID="f6ccc3cf8e5e1fac21a450937d818f73d9c8ea21d213cc087495650a551817ba" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.042752 5116 scope.go:117] "RemoveContainer" containerID="a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.059449 5116 scope.go:117] "RemoveContainer" containerID="58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.073658 5116 scope.go:117] "RemoveContainer" containerID="3662e71bd41b60c7bbef1f51273ae388448fc2e3a846e9f692b29bbba4929dce" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.087970 5116 scope.go:117] "RemoveContainer" containerID="650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.506546 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qtstp"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507481 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507531 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507545 5116 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-utilities" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507551 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-utilities" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507561 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507569 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507580 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507587 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507599 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507606 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507907 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-utilities" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507919 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-utilities" Mar 22 
00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507929 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507967 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508004 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508012 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508024 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508031 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508054 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-utilities" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508064 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-utilities" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508075 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-utilities" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508084 5116 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-utilities" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508101 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508108 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-content" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508119 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508126 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508139 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508146 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508276 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508286 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508299 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508308 5116 
memory_manager.go:356] "RemoveStaleState removing state" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508314 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508529 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.529816 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtstp"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.530119 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.532655 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.660724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8wn\" (UniqueName: \"kubernetes.io/projected/fb40c619-b024-485e-8fab-590cf66159b3-kube-api-access-mc8wn\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.660831 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-catalog-content\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc 
kubenswrapper[5116]: I0322 00:13:54.660937 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-utilities\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.704026 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.710403 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.713857 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.721652 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762138 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8wn\" (UniqueName: \"kubernetes.io/projected/fb40c619-b024-485e-8fab-590cf66159b3-kube-api-access-mc8wn\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762231 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-catalog-content\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762338 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-utilities\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762848 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-catalog-content\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762979 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-utilities\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.788651 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8wn\" (UniqueName: \"kubernetes.io/projected/fb40c619-b024-485e-8fab-590cf66159b3-kube-api-access-mc8wn\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.849250 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.865558 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.865674 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.865791 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.966917 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967000 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod 
\"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967040 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967565 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967721 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.992249 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.036102 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.281844 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtstp"] Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.424804 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:13:55 crc kubenswrapper[5116]: W0322 00:13:55.490070 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f62de57_b304_469a_ab77_b6796a6a482c.slice/crio-2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1 WatchSource:0}: Error finding container 2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1: Status 404 returned error can't find the container with id 2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1 Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.711615 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" path="/var/lib/kubelet/pods/23e39fb8-29b4-4a99-b189-3cd7c8e7f488/volumes" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.712838 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" path="/var/lib/kubelet/pods/696eed68-bf2d-4bbd-865f-07998d61f8ab/volumes" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.713662 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" path="/var/lib/kubelet/pods/77380b82-4c44-4cfd-a7b1-e77b060af507/volumes" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.715065 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" path="/var/lib/kubelet/pods/fe41a890-8a59-4fc7-b392-b7bab2ad5832/volumes" Mar 22 00:13:55 crc 
kubenswrapper[5116]: I0322 00:13:55.902444 5116 generic.go:358] "Generic (PLEG): container finished" podID="4f62de57-b304-469a-ab77-b6796a6a482c" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" exitCode=0 Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.902499 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e"} Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.902541 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerStarted","Data":"2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1"} Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.904027 5116 generic.go:358] "Generic (PLEG): container finished" podID="fb40c619-b024-485e-8fab-590cf66159b3" containerID="abad814a6663bed533637d502f221eb9c34b41414e0481ef65c9bc4512985738" exitCode=0 Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.904115 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerDied","Data":"abad814a6663bed533637d502f221eb9c34b41414e0481ef65c9bc4512985738"} Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.904153 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerStarted","Data":"1a8e889707bedaf7985da62aeb60a3f1b2bd347bc6adeb83f6a726745ca9eb2c"} Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.929066 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v85n6"] Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 
00:13:56.934743 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.935570 5116 generic.go:358] "Generic (PLEG): container finished" podID="fb40c619-b024-485e-8fab-590cf66159b3" containerID="8433eb6d124456dab2dc0752c6f375f18adb082633723a7b3fa0268293c52692" exitCode=0 Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.935701 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerDied","Data":"8433eb6d124456dab2dc0752c6f375f18adb082633723a7b3fa0268293c52692"} Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.939214 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v85n6"] Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.942003 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.942680 5116 generic.go:358] "Generic (PLEG): container finished" podID="4f62de57-b304-469a-ab77-b6796a6a482c" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" exitCode=0 Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.942758 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.100235 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz6lx\" (UniqueName: \"kubernetes.io/projected/132f688e-74fb-4bbb-844a-a23467633e19-kube-api-access-kz6lx\") pod \"redhat-operators-v85n6\" (UID: 
\"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.100310 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-catalog-content\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.100399 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-utilities\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.102434 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wkk2b"] Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.113660 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkk2b"] Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.113810 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.116245 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.201964 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz6lx\" (UniqueName: \"kubernetes.io/projected/132f688e-74fb-4bbb-844a-a23467633e19-kube-api-access-kz6lx\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202038 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-catalog-content\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202202 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-utilities\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202240 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-utilities\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202263 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-catalog-content\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202283 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhmcg\" (UniqueName: \"kubernetes.io/projected/9cc37111-4983-4dbc-a277-b77d2fc47508-kube-api-access-bhmcg\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202637 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-catalog-content\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202691 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-utilities\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.232528 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz6lx\" (UniqueName: \"kubernetes.io/projected/132f688e-74fb-4bbb-844a-a23467633e19-kube-api-access-kz6lx\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.270667 5116 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304113 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-utilities\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304291 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-catalog-content\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304338 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhmcg\" (UniqueName: \"kubernetes.io/projected/9cc37111-4983-4dbc-a277-b77d2fc47508-kube-api-access-bhmcg\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304838 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-catalog-content\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304998 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-utilities\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " 
pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.324304 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhmcg\" (UniqueName: \"kubernetes.io/projected/9cc37111-4983-4dbc-a277-b77d2fc47508-kube-api-access-bhmcg\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.429758 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.704570 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v85n6"] Mar 22 00:13:57 crc kubenswrapper[5116]: W0322 00:13:57.709626 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod132f688e_74fb_4bbb_844a_a23467633e19.slice/crio-e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922 WatchSource:0}: Error finding container e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922: Status 404 returned error can't find the container with id e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922 Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.868124 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkk2b"] Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.949661 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerStarted","Data":"e608f034634a1d6831bf39293af4171fbdc6fab08158d6d732d8693f36437a64"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.953332 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerStarted","Data":"b6167cb8692fe7c05a23fa685b3f7c66b6b19d918c2beb0018d416ac5abf2906"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.955093 5116 generic.go:358] "Generic (PLEG): container finished" podID="132f688e-74fb-4bbb-844a-a23467633e19" containerID="cac52d5e77020597f48a778f9ab7cdf9f57f7963b547fca6b30fcdf32fb1af24" exitCode=0 Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.955358 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerDied","Data":"cac52d5e77020597f48a778f9ab7cdf9f57f7963b547fca6b30fcdf32fb1af24"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.955479 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerStarted","Data":"e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.963429 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerStarted","Data":"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.970459 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qtstp" podStartSLOduration=3.401257656 podStartE2EDuration="3.970438498s" podCreationTimestamp="2026-03-22 00:13:54 +0000 UTC" firstStartedPulling="2026-03-22 00:13:55.905234254 +0000 UTC m=+306.927535627" lastFinishedPulling="2026-03-22 00:13:56.474415056 +0000 UTC m=+307.496716469" observedRunningTime="2026-03-22 00:13:57.966053584 +0000 UTC m=+308.988354967" 
watchObservedRunningTime="2026-03-22 00:13:57.970438498 +0000 UTC m=+308.992739871"
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.011062 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m2vjz" podStartSLOduration=3.444734259 podStartE2EDuration="4.011038131s" podCreationTimestamp="2026-03-22 00:13:54 +0000 UTC" firstStartedPulling="2026-03-22 00:13:55.903337806 +0000 UTC m=+306.925639179" lastFinishedPulling="2026-03-22 00:13:56.469641648 +0000 UTC m=+307.491943051" observedRunningTime="2026-03-22 00:13:58.007507997 +0000 UTC m=+309.029809370" watchObservedRunningTime="2026-03-22 00:13:58.011038131 +0000 UTC m=+309.033339504"
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.972273 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cc37111-4983-4dbc-a277-b77d2fc47508" containerID="1e43b85690d74b47388096377dda7ee7347944fba90172e40e429b79861427bf" exitCode=0
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.972847 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerDied","Data":"1e43b85690d74b47388096377dda7ee7347944fba90172e40e429b79861427bf"}
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.985823 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerStarted","Data":"e9911ca0330d44857fae6b7f5bdddf69e511cad69fd13be56d12e4bb32c4cd67"}
Mar 22 00:13:59 crc kubenswrapper[5116]: I0322 00:13:59.991993 5116 generic.go:358] "Generic (PLEG): container finished" podID="132f688e-74fb-4bbb-844a-a23467633e19" containerID="e9911ca0330d44857fae6b7f5bdddf69e511cad69fd13be56d12e4bb32c4cd67" exitCode=0
Mar 22 00:13:59 crc kubenswrapper[5116]: I0322 00:13:59.992114 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerDied","Data":"e9911ca0330d44857fae6b7f5bdddf69e511cad69fd13be56d12e4bb32c4cd67"}
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.138568 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.160867 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.161101 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.164182 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.164805 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.165430 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.240682 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"auto-csr-approver-29568974-w8j5j\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") " pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.341905 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"auto-csr-approver-29568974-w8j5j\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") " pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.366219 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"auto-csr-approver-29568974-w8j5j\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") " pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.510521 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.893439 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.000101 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568974-w8j5j" event={"ID":"3b94e50a-fe81-48fe-a23a-c15956c06d21","Type":"ContainerStarted","Data":"a83e43008d6fd84b636448e3d5d1fdd64918cee13e1036fbb41ee766e29a6fe0"}
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.001814 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cc37111-4983-4dbc-a277-b77d2fc47508" containerID="55f8617fd0750429feb258f5f415f3d00b17bc8699b50faa2e41ad61cc896c20" exitCode=0
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.001916 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerDied","Data":"55f8617fd0750429feb258f5f415f3d00b17bc8699b50faa2e41ad61cc896c20"}
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.005180 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerStarted","Data":"07725b0bced148bfeb9beb2ed062e98ce22b68516822b3b06a43b5e4bc108816"}
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.043833 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v85n6" podStartSLOduration=4.257931239 podStartE2EDuration="5.043817752s" podCreationTimestamp="2026-03-22 00:13:56 +0000 UTC" firstStartedPulling="2026-03-22 00:13:57.956290558 +0000 UTC m=+308.978591941" lastFinishedPulling="2026-03-22 00:13:58.742177081 +0000 UTC m=+309.764478454" observedRunningTime="2026-03-22 00:14:01.040463624 +0000 UTC m=+312.062765007" watchObservedRunningTime="2026-03-22 00:14:01.043817752 +0000 UTC m=+312.066119125"
Mar 22 00:14:02 crc kubenswrapper[5116]: I0322 00:14:02.015588 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerStarted","Data":"017282eaa72fb6d6afc670e6c507047b70540289ad8c6e1e984790f43486cbe4"}
Mar 22 00:14:03 crc kubenswrapper[5116]: I0322 00:14:03.029601 5116 generic.go:358] "Generic (PLEG): container finished" podID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerID="2acebccbc85d9eff1c121aca735947ed6d77f0c1bc6b89aca01a5fc1d6de9f77" exitCode=0
Mar 22 00:14:03 crc kubenswrapper[5116]: I0322 00:14:03.029654 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568974-w8j5j" event={"ID":"3b94e50a-fe81-48fe-a23a-c15956c06d21","Type":"ContainerDied","Data":"2acebccbc85d9eff1c121aca735947ed6d77f0c1bc6b89aca01a5fc1d6de9f77"}
Mar 22 00:14:03 crc kubenswrapper[5116]: I0322 00:14:03.044934 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wkk2b" podStartSLOduration=5.127768927 podStartE2EDuration="6.044918904s" podCreationTimestamp="2026-03-22 00:13:57 +0000 UTC" firstStartedPulling="2026-03-22 00:13:58.983060655 +0000 UTC m=+310.005362028" lastFinishedPulling="2026-03-22 00:13:59.900210632 +0000 UTC m=+310.922512005" observedRunningTime="2026-03-22 00:14:02.04626093 +0000 UTC m=+313.068562323" watchObservedRunningTime="2026-03-22 00:14:03.044918904 +0000 UTC m=+314.067220267"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.300709 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.391900 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"3b94e50a-fe81-48fe-a23a-c15956c06d21\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") "
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.397477 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5" (OuterVolumeSpecName: "kube-api-access-85ph5") pod "3b94e50a-fe81-48fe-a23a-c15956c06d21" (UID: "3b94e50a-fe81-48fe-a23a-c15956c06d21"). InnerVolumeSpecName "kube-api-access-85ph5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.493771 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") on node \"crc\" DevicePath \"\""
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.850574 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.850637 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.904133 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.037201 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.038097 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.043820 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.043818 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568974-w8j5j" event={"ID":"3b94e50a-fe81-48fe-a23a-c15956c06d21","Type":"ContainerDied","Data":"a83e43008d6fd84b636448e3d5d1fdd64918cee13e1036fbb41ee766e29a6fe0"}
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.043869 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83e43008d6fd84b636448e3d5d1fdd64918cee13e1036fbb41ee766e29a6fe0"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.077761 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.080194 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:06 crc kubenswrapper[5116]: I0322 00:14:06.089193 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.271646 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.271713 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.311239 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.430822 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.430882 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.480645 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:08 crc kubenswrapper[5116]: I0322 00:14:08.094994 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:08 crc kubenswrapper[5116]: I0322 00:14:08.115730 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.139351 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"]
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.140997 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerName="oc"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.141018 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerName="oc"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.141207 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerName="oc"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.154957 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"]
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.155335 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.160123 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\""
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.160501 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\""
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.229524 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.229603 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.229844 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.331144 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.331380 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.331479 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.332463 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.343521 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.348579 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.482281 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.892253 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"]
Mar 22 00:15:01 crc kubenswrapper[5116]: I0322 00:15:01.399580 5116 generic.go:358] "Generic (PLEG): container finished" podID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerID="fe8cd95451621929fe3a692f2551d46a8b780ef88806567692d43ff15530e87d" exitCode=0
Mar 22 00:15:01 crc kubenswrapper[5116]: I0322 00:15:01.399820 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9" event={"ID":"cd79fdbb-2811-4c9c-a80b-ad21ccc89560","Type":"ContainerDied","Data":"fe8cd95451621929fe3a692f2551d46a8b780ef88806567692d43ff15530e87d"}
Mar 22 00:15:01 crc kubenswrapper[5116]: I0322 00:15:01.399856 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9" event={"ID":"cd79fdbb-2811-4c9c-a80b-ad21ccc89560","Type":"ContainerStarted","Data":"bc27613337a375500fc7577bd8bca5cde7fd28e3d075541abefdba644008eecc"}
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.620914 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.762474 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") "
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.762566 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") "
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.762663 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") "
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.763405 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume" (OuterVolumeSpecName: "config-volume") pod "cd79fdbb-2811-4c9c-a80b-ad21ccc89560" (UID: "cd79fdbb-2811-4c9c-a80b-ad21ccc89560"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.767783 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cd79fdbb-2811-4c9c-a80b-ad21ccc89560" (UID: "cd79fdbb-2811-4c9c-a80b-ad21ccc89560"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.768002 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2" (OuterVolumeSpecName: "kube-api-access-svcw2") pod "cd79fdbb-2811-4c9c-a80b-ad21ccc89560" (UID: "cd79fdbb-2811-4c9c-a80b-ad21ccc89560"). InnerVolumeSpecName "kube-api-access-svcw2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.864614 5116 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.864665 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") on node \"crc\" DevicePath \"\""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.864680 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") on node \"crc\" DevicePath \"\""
Mar 22 00:15:03 crc kubenswrapper[5116]: I0322 00:15:03.414640 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:03 crc kubenswrapper[5116]: I0322 00:15:03.414665 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9" event={"ID":"cd79fdbb-2811-4c9c-a80b-ad21ccc89560","Type":"ContainerDied","Data":"bc27613337a375500fc7577bd8bca5cde7fd28e3d075541abefdba644008eecc"}
Mar 22 00:15:03 crc kubenswrapper[5116]: I0322 00:15:03.414704 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc27613337a375500fc7577bd8bca5cde7fd28e3d075541abefdba644008eecc"
Mar 22 00:15:23 crc kubenswrapper[5116]: I0322 00:15:23.057716 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:15:23 crc kubenswrapper[5116]: I0322 00:15:23.058423 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:15:53 crc kubenswrapper[5116]: I0322 00:15:53.057107 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:15:53 crc kubenswrapper[5116]: I0322 00:15:53.058047 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.139072 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"]
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.141007 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerName="collect-profiles"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.141026 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerName="collect-profiles"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.141499 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerName="collect-profiles"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.153654 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.156531 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.157077 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.157596 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.157843 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"]
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.235902 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"auto-csr-approver-29568976-b4g9d\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") " pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.337149 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"auto-csr-approver-29568976-b4g9d\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") " pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.359019 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"auto-csr-approver-29568976-b4g9d\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") " pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.477116 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.690403 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"]
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.790456 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568976-b4g9d" event={"ID":"832c911e-4692-4912-8df4-880e98e4c2c1","Type":"ContainerStarted","Data":"d770bc2085ad293f09a4219b3201b776c0a1c8377f5e30755af76fa1942bd7c6"}
Mar 22 00:16:02 crc kubenswrapper[5116]: I0322 00:16:02.807375 5116 generic.go:358] "Generic (PLEG): container finished" podID="832c911e-4692-4912-8df4-880e98e4c2c1" containerID="9cfe6ad0080f9bd011bb482561dcac74a7fc0e16adff6a8d4fce7c2e783aaf6b" exitCode=0
Mar 22 00:16:02 crc kubenswrapper[5116]: I0322 00:16:02.807501 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568976-b4g9d" event={"ID":"832c911e-4692-4912-8df4-880e98e4c2c1","Type":"ContainerDied","Data":"9cfe6ad0080f9bd011bb482561dcac74a7fc0e16adff6a8d4fce7c2e783aaf6b"}
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.091103 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.184992 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"832c911e-4692-4912-8df4-880e98e4c2c1\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") "
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.192463 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw" (OuterVolumeSpecName: "kube-api-access-gjtcw") pod "832c911e-4692-4912-8df4-880e98e4c2c1" (UID: "832c911e-4692-4912-8df4-880e98e4c2c1"). InnerVolumeSpecName "kube-api-access-gjtcw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.286940 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") on node \"crc\" DevicePath \"\""
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.823685 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568976-b4g9d" event={"ID":"832c911e-4692-4912-8df4-880e98e4c2c1","Type":"ContainerDied","Data":"d770bc2085ad293f09a4219b3201b776c0a1c8377f5e30755af76fa1942bd7c6"}
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.823763 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d770bc2085ad293f09a4219b3201b776c0a1c8377f5e30755af76fa1942bd7c6"
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.823872 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057069 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057393 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057437 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057897 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057960 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e" gracePeriod=600
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.962625 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e" exitCode=0
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.962713 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e"}
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.963055 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e"}
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.963087 5116 scope.go:117] "RemoveContainer" containerID="a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"
Mar 22 00:16:49 crc kubenswrapper[5116]: I0322 00:16:49.971644 5116 scope.go:117] "RemoveContainer" containerID="0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.137900 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"]
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.139862 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" containerName="oc"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.139892 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" containerName="oc"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.140706 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" containerName="oc"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.149360 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"]
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.149552 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.154954 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.155543 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.155761 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.322334 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"auto-csr-approver-29568978-vkbll\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.423412 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"auto-csr-approver-29568978-vkbll\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.461827 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"auto-csr-approver-29568978-vkbll\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.478603 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.670505 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"]
Mar 22 00:18:01 crc kubenswrapper[5116]: I0322 00:18:01.558898 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568978-vkbll" event={"ID":"07ada1f6-f713-45cb-8230-b9a2d89878ab","Type":"ContainerStarted","Data":"efda434b38a357545c0d29d28142fb26e6450f11a2171b5254593e5038046112"}
Mar 22 00:18:02 crc kubenswrapper[5116]: I0322 00:18:02.566348 5116 generic.go:358] "Generic (PLEG): container finished" podID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerID="208d35041a700bcc47fefb636464fe18464c55b6addf5e55a5a1888e5fa3efb2" exitCode=0
Mar 22 00:18:02 crc kubenswrapper[5116]: I0322 00:18:02.566471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568978-vkbll" event={"ID":"07ada1f6-f713-45cb-8230-b9a2d89878ab","Type":"ContainerDied","Data":"208d35041a700bcc47fefb636464fe18464c55b6addf5e55a5a1888e5fa3efb2"}
Mar 22 00:18:03 crc kubenswrapper[5116]: I0322 00:18:03.811430 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll" Mar 22 00:18:03 crc kubenswrapper[5116]: I0322 00:18:03.973625 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"07ada1f6-f713-45cb-8230-b9a2d89878ab\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " Mar 22 00:18:03 crc kubenswrapper[5116]: I0322 00:18:03.984569 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2" (OuterVolumeSpecName: "kube-api-access-d5md2") pod "07ada1f6-f713-45cb-8230-b9a2d89878ab" (UID: "07ada1f6-f713-45cb-8230-b9a2d89878ab"). InnerVolumeSpecName "kube-api-access-d5md2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.076327 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.581891 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568978-vkbll" event={"ID":"07ada1f6-f713-45cb-8230-b9a2d89878ab","Type":"ContainerDied","Data":"efda434b38a357545c0d29d28142fb26e6450f11a2171b5254593e5038046112"} Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.582287 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efda434b38a357545c0d29d28142fb26e6450f11a2171b5254593e5038046112" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.581912 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.877396 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.882484 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:18:05 crc kubenswrapper[5116]: I0322 00:18:05.703643 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" path="/var/lib/kubelet/pods/907ec022-a4e4-4d33-8329-52c9bbb71520/volumes" Mar 22 00:18:23 crc kubenswrapper[5116]: I0322 00:18:23.056770 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:18:23 crc kubenswrapper[5116]: I0322 00:18:23.057392 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:18:49 crc kubenswrapper[5116]: I0322 00:18:49.904888 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:18:49 crc kubenswrapper[5116]: I0322 00:18:49.915259 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:18:50 crc kubenswrapper[5116]: I0322 00:18:50.027583 5116 
scope.go:117] "RemoveContainer" containerID="5f050f299176f9b417ea910e3bb8affec9c6d4bf35a6de76d0aa5ed0d88ddf0f" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.057401 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.057810 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.417591 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"] Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.417932 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" containerID="cri-o://6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.418081 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" containerID="cri-o://ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.616380 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.629870 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n9zvq"] Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630499 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller" containerID="cri-o://75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630547 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node" containerID="cri-o://75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630640 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging" containerID="cri-o://03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630665 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630802 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" 
containerName="sbdb" containerID="cri-o://d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630914 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb" containerID="cri-o://c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.632109 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd" containerID="cri-o://8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649054 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk"] Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649822 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerName="oc" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649847 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerName="oc" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649858 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649866 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649906 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649915 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.650046 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.650062 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.650071 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerName="oc" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.653498 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.671853 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller" containerID="cri-o://b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.736777 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.736887 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.737025 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.737056 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.738201 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.738845 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.752299 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.752398 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4" (OuterVolumeSpecName: "kube-api-access-ntdv4") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "kube-api-access-ntdv4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838446 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838520 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838550 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838622 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cd6p\" (UniqueName: \"kubernetes.io/projected/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-kube-api-access-8cd6p\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838742 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838758 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838769 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838782 5116 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 
00:18:53.939583 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.939645 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.939664 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.939701 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cd6p\" (UniqueName: \"kubernetes.io/projected/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-kube-api-access-8cd6p\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.940275 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: 
\"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.940443 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.944537 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.955981 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cd6p\" (UniqueName: \"kubernetes.io/projected/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-kube-api-access-8cd6p\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.967278 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-acl-logging/0.log" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.967755 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-controller/0.log" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.968236 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.989265 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.035619 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.038059 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.038233 5116 generic.go:358] "Generic (PLEG): container finished" podID="5188f25b-37c3-46f1-b939-199c6e082848" containerID="15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0" exitCode=2 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.038501 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerDied","Data":"15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.039991 5116 scope.go:117] "RemoveContainer" containerID="15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.040400 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rsw9b"] Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041411 5116 generic.go:358] "Generic (PLEG): container finished" podID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041455 5116 generic.go:358] "Generic (PLEG): container finished" podID="e17ab744-68a7-4a24-8ef2-556696d752fb" 
containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041758 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041779 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041798 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041808 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041820 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041827 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041853 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041860 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="sbdb"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041939 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="sbdb"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041952 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041976 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041992 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041998 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042017 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042022 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042034 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042066 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042088 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kubecfg-setup"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042095 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kubecfg-setup"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042349 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042371 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042381 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042396 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042414 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042430 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="sbdb"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042444 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042453 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.048706 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-acl-logging/0.log"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051101 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-controller/0.log"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051902 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" exitCode=0
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051935 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" exitCode=0
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051948 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" exitCode=0
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051958 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" exitCode=0
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051968 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" exitCode=0
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051977 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" exitCode=0
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051986 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" exitCode=143
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.052000 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" exitCode=143
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.052444 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059285 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerDied","Data":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059427 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerDied","Data":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059443 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerDied","Data":"7eadfb4600290cb56b95da12f03d4c885e0344117c4889ec529ea4aaac7dd7ce"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059457 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059474 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059486 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059499 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059513 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059525 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059540 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059556 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059562 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059569 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059575 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059581 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059587 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059594 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059589 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059626 5116 scope.go:117] "RemoveContainer" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059600 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059957 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060028 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060040 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060202 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060214 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060220 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060227 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060233 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060239 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060245 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060254 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060288 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060300 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060305 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060309 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060314 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060319 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060323 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060328 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060333 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060340 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060371 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060377 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060381 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060386 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060393 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060398 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060402 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060407 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060411 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"}
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.082811 5116 scope.go:117] "RemoveContainer" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.112490 5116 scope.go:117] "RemoveContainer" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"
Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.113102 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": container with ID starting with ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8 not found: ID does not exist" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113160 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"} err="failed to get container status \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": rpc error: code = NotFound desc = could not find container \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": container with ID starting with ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113211 5116 scope.go:117] "RemoveContainer" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"
Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.113759 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": container with ID starting with 6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7 not found: ID does not exist" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113800 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"} err="failed to get container status \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": rpc error: code = NotFound desc = could not find container \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": container with ID starting with 6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113823 5116 scope.go:117] "RemoveContainer" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.114365 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"} err="failed to get container status \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": rpc error: code = NotFound desc = could not find container \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": container with ID starting with ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.114402 5116 scope.go:117] "RemoveContainer" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.116095 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"]
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.116433 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"} err="failed to get container status \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": rpc error: code = NotFound desc = could not find container \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": container with ID starting with 6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.116462 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.121467 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"]
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.142982 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143030 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143057 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143087 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143102 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143129 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143157 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143252 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143281 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143313 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143352 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143375 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143394 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143408 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143449 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143461 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143493 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143517 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143561 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143620 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") "
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143810 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log" (OuterVolumeSpecName: "node-log") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143979 5116 reconciler_common.go:299] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") on node \"crc\" DevicePath \"\""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143987 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144062 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144096 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144113 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144123 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144154 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144193 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144226 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144249 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144639 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket" (OuterVolumeSpecName: "log-socket") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144683 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144711 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash" (OuterVolumeSpecName: "host-slash") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144836 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.145240 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.145262 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.145287 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.146242 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.148430 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.148891 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r" (OuterVolumeSpecName: "kube-api-access-8qp9r") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "kube-api-access-8qp9r".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.158316 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.173283 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.190248 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.208640 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.234800 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244851 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-systemd-units\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244899 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-env-overrides\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244924 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-ovn\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244946 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-etc-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244974 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244997 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-var-lib-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245019 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovn-node-metrics-cert\") 
pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245045 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245067 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-kubelet\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245089 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-netns\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245107 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-systemd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245144 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-netd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245160 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-node-log\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245200 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-slash\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245223 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zx6j\" (UniqueName: \"kubernetes.io/projected/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-kube-api-access-2zx6j\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245264 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245284 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-script-lib\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245310 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-bin\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245333 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-log-socket\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245358 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-config\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245423 5116 reconciler_common.go:299] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245434 5116 reconciler_common.go:299] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc 
kubenswrapper[5116]: I0322 00:18:54.245444 5116 reconciler_common.go:299] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245455 5116 reconciler_common.go:299] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245465 5116 reconciler_common.go:299] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245476 5116 reconciler_common.go:299] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245487 5116 reconciler_common.go:299] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245497 5116 reconciler_common.go:299] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245508 5116 reconciler_common.go:299] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245519 5116 reconciler_common.go:299] 
"Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245529 5116 reconciler_common.go:299] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245539 5116 reconciler_common.go:299] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245548 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245558 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245568 5116 reconciler_common.go:299] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.246863 5116 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.247158 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.247188 5116 reconciler_common.go:299] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.247200 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.254053 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.270396 5116 scope.go:117] "RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.301387 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317063 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.317519 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317552 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317579 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.317793 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317821 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317840 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.318241 5116 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318308 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318356 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.318813 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318847 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container 
\"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318868 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.319217 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319250 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319268 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.319547 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist" 
containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319593 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319610 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.320012 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320033 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320047 5116 scope.go:117] 
"RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.320287 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320313 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320325 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.320732 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320751 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320764 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321052 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321096 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321434 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321453 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321707 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321726 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321916 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321934 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322195 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322214 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322369 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322387 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322603 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322647 5116 scope.go:117] "RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322829 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322847 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323014 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323037 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323320 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323338 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323686 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323711 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323901 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323921 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324098 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324118 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324394 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324418 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324612 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324632 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324882 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324944 5116 scope.go:117] "RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325192 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325213 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325383 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325403 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325941 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325976 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326207 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326228 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326409 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326428 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326669 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326686 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326846 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326864 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327053 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327074 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327258 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327281 5116 scope.go:117] "RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327475 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327492 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327659 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349038 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-env-overrides\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349102 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-ovn\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349124 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-etc-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349149 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349278 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-ovn\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349331 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349398 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-etc-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350087 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-env-overrides\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350160 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-var-lib-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350303 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovn-node-metrics-cert\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350324 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-var-lib-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350347 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350419 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-kubelet\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350473 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-netns\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350493 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-systemd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350593 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-netd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350604 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-kubelet\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350635 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-node-log\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350671 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-systemd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350673 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-netd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350635 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-netns\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350727 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-node-log\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350775 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-slash\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350832 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zx6j\" (UniqueName: \"kubernetes.io/projected/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-kube-api-access-2zx6j\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350878 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-slash\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350985 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351022 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350992 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351057 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-script-lib\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351098 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-bin\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351139 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-log-socket\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351237 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-config\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351305 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-systemd-units\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351320 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-bin\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351352 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-systemd-units\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351475 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-log-socket\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-script-lib\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351963 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-config\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.367035 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovn-node-metrics-cert\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.369783 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zx6j\" (UniqueName: \"kubernetes.io/projected/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-kube-api-access-2zx6j\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.386044 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b"
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.391279 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n9zvq"]
Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.400282 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n9zvq"]
Mar 22 00:18:54 crc kubenswrapper[5116]: W0322 00:18:54.423268 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fdd440_8f3d_4ab4_a86e_85bf7c1ee883.slice/crio-7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb WatchSource:0}: Error finding container 7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb: Status 404 returned error can't find the container with id 7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.063118 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.063299 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerStarted","Data":"4403801b89ed99b6ab9a49e3e33da08a8d566065fa2e776b22226d031e88abf8"}
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.068643 5116 generic.go:358] "Generic (PLEG): container finished" podID="16fdd440-8f3d-4ab4-a86e-85bf7c1ee883" containerID="9637c7982f5541f4863e209ccefbe5a3de02eb6f01f5fd44a7c7428079f51db3" exitCode=0
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.068720 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerDied","Data":"9637c7982f5541f4863e209ccefbe5a3de02eb6f01f5fd44a7c7428079f51db3"}
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.068742 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb"}
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.074077 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" event={"ID":"d9a45017-2c2a-4fa3-9277-4d1d8b674faf","Type":"ContainerStarted","Data":"26e2afe933d986e2c7a725e228aeba5907387b34ecaec01a61cfa250ee737719"}
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.074117 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" event={"ID":"d9a45017-2c2a-4fa3-9277-4d1d8b674faf","Type":"ContainerStarted","Data":"6a45e13aa9c81e20728a9bc8d289081021e3ea28336dc4a228b385a1765b48f0"}
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.074133 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" event={"ID":"d9a45017-2c2a-4fa3-9277-4d1d8b674faf","Type":"ContainerStarted","Data":"c6bd055f4ea29e0ea189cc3bc424ed69095f061fb751917ab55890b51db96f8e"}
Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.098105 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" podStartSLOduration=2.098088355 podStartE2EDuration="2.098088355s" podCreationTimestamp="2026-03-22 00:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:18:55.094419711 +0000 UTC m=+606.116721084"
watchObservedRunningTime="2026-03-22 00:18:55.098088355 +0000 UTC m=+606.120389728" Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.705605 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" path="/var/lib/kubelet/pods/e17ab744-68a7-4a24-8ef2-556696d752fb/volumes" Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.706886 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" path="/var/lib/kubelet/pods/ec484e57-1508-45a3-99a3-51dfa8ef6195/volumes" Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085641 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"8ca5e805f39da4e140c396a8f79429ea198e0b1151f50173e35116ca3b605426"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085685 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"e340de5046b550673ff3fed848836b28e00dde668e71925c8644ca64ff2d85c4"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085695 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"661756b1ac6636342786b79269815da09abc6fe26ff33701897a5b1bfff92e8c"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085704 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"315b8e7fb1607aa4c1b643495f6de8b806b06588a5eb7e11d996ddc9e320d085"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085714 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"f5697292eeb4e45b09fba5203605fa49faae7deb1295a75e0b8b92e9ddf4ee32"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085725 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"577160a3e8d068db15afd3a913144caa27408dcca13cda44c8a9e6fe4896a096"} Mar 22 00:18:58 crc kubenswrapper[5116]: I0322 00:18:58.098557 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"173b0b1f3d2caca659e6628f85cb4c9d689999d6a6ffb57b65d5fe3d25ced049"} Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121160 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"070bfc5b096e07b982c3085b0f52711e8f80913d668dfd5ae5d7a9d5a2ec3fc7"} Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121780 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121795 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121804 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.148328 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.149699 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.155962 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" podStartSLOduration=7.155943691 podStartE2EDuration="7.155943691s" podCreationTimestamp="2026-03-22 00:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:19:01.153316763 +0000 UTC m=+612.175618146" watchObservedRunningTime="2026-03-22 00:19:01.155943691 +0000 UTC m=+612.178245064" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.057299 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058054 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058113 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058739 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058813 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e" gracePeriod=600 Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.255100 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e" exitCode=0 Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.255196 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e"} Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.255251 5116 scope.go:117] "RemoveContainer" containerID="b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e" Mar 22 00:19:24 crc kubenswrapper[5116]: I0322 00:19:24.267826 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2"} Mar 22 00:19:33 crc kubenswrapper[5116]: I0322 00:19:33.171623 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.496078 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 
00:19:58.496946 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m2vjz" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" containerID="cri-o://1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" gracePeriod=30 Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.864025 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.939415 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"4f62de57-b304-469a-ab77-b6796a6a482c\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.939634 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod \"4f62de57-b304-469a-ab77-b6796a6a482c\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.939699 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"4f62de57-b304-469a-ab77-b6796a6a482c\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.941427 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities" (OuterVolumeSpecName: "utilities") pod "4f62de57-b304-469a-ab77-b6796a6a482c" (UID: "4f62de57-b304-469a-ab77-b6796a6a482c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.945731 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9" (OuterVolumeSpecName: "kube-api-access-xwxf9") pod "4f62de57-b304-469a-ab77-b6796a6a482c" (UID: "4f62de57-b304-469a-ab77-b6796a6a482c"). InnerVolumeSpecName "kube-api-access-xwxf9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.964082 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f62de57-b304-469a-ab77-b6796a6a482c" (UID: "4f62de57-b304-469a-ab77-b6796a6a482c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.042311 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") on node \"crc\" DevicePath \"\"" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.042366 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.042384 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.495675 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 
00:19:59.496491 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-utilities" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496518 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-utilities" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496539 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-content" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496547 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-content" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496560 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496569 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496691 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.503712 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504454 5116 generic.go:358] "Generic (PLEG): container finished" podID="4f62de57-b304-469a-ab77-b6796a6a482c" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" exitCode=0 Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504520 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504555 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494"} Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504597 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1"} Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504619 5116 scope.go:117] "RemoveContainer" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.508623 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.524001 5116 scope.go:117] "RemoveContainer" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.546412 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.556557 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.557400 5116 scope.go:117] "RemoveContainer" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.575307 5116 scope.go:117] "RemoveContainer" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" Mar 22 00:19:59 crc 
kubenswrapper[5116]: E0322 00:19:59.576059 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494\": container with ID starting with 1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494 not found: ID does not exist" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576100 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494"} err="failed to get container status \"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494\": rpc error: code = NotFound desc = could not find container \"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494\": container with ID starting with 1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494 not found: ID does not exist" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576121 5116 scope.go:117] "RemoveContainer" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" Mar 22 00:19:59 crc kubenswrapper[5116]: E0322 00:19:59.576485 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155\": container with ID starting with 6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155 not found: ID does not exist" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576528 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155"} err="failed to get container status 
\"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155\": rpc error: code = NotFound desc = could not find container \"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155\": container with ID starting with 6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155 not found: ID does not exist" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576554 5116 scope.go:117] "RemoveContainer" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" Mar 22 00:19:59 crc kubenswrapper[5116]: E0322 00:19:59.577055 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e\": container with ID starting with 27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e not found: ID does not exist" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.577088 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e"} err="failed to get container status \"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e\": rpc error: code = NotFound desc = could not find container \"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e\": container with ID starting with 27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e not found: ID does not exist" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653276 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-bound-sa-token\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 
00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653330 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-tls\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653380 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85d2e292-d36f-4bca-82b5-0a2770f13848-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653408 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85d2e292-d36f-4bca-82b5-0a2770f13848-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653442 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653545 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-trusted-ca\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653631 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wl8\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-kube-api-access-z6wl8\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653719 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-certificates\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.673238 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.704388 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" path="/var/lib/kubelet/pods/4f62de57-b304-469a-ab77-b6796a6a482c/volumes" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.754917 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-certificates\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755002 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-bound-sa-token\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755201 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-tls\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755442 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85d2e292-d36f-4bca-82b5-0a2770f13848-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755517 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85d2e292-d36f-4bca-82b5-0a2770f13848-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755673 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-trusted-ca\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755747 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wl8\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-kube-api-access-z6wl8\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755962 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85d2e292-d36f-4bca-82b5-0a2770f13848-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.756766 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-certificates\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.756776 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-trusted-ca\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 
00:19:59.760640 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-tls\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.761339 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85d2e292-d36f-4bca-82b5-0a2770f13848-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.773260 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-bound-sa-token\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.778881 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wl8\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-kube-api-access-z6wl8\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.827443 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.141383 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"] Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.152770 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"] Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.152865 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.155698 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.156487 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.160757 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.264261 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"auto-csr-approver-29568980-ksbk2\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") " pod="openshift-infra/auto-csr-approver-29568980-ksbk2" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.281228 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"] Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.365884 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fld7p\" 
(UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"auto-csr-approver-29568980-ksbk2\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") " pod="openshift-infra/auto-csr-approver-29568980-ksbk2" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.385283 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"auto-csr-approver-29568980-ksbk2\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") " pod="openshift-infra/auto-csr-approver-29568980-ksbk2" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.470405 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.519265 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" event={"ID":"85d2e292-d36f-4bca-82b5-0a2770f13848","Type":"ContainerStarted","Data":"51c2db15e29179571dd629b15fa52011ea2ebfbdc5e7921b02b67557958ca769"} Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.519341 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" event={"ID":"85d2e292-d36f-4bca-82b5-0a2770f13848","Type":"ContainerStarted","Data":"99744a56eaf9459785159148e99774d3cd556e1360a660b8b2e0de57813e055e"} Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.520245 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.551705 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" podStartSLOduration=1.551689778 podStartE2EDuration="1.551689778s" 
podCreationTimestamp="2026-03-22 00:19:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:20:00.546541362 +0000 UTC m=+671.568842735" watchObservedRunningTime="2026-03-22 00:20:00.551689778 +0000 UTC m=+671.573991151" Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.651974 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"] Mar 22 00:20:01 crc kubenswrapper[5116]: I0322 00:20:01.524930 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" event={"ID":"11840734-dc87-4532-b341-aeb889f011c4","Type":"ContainerStarted","Data":"c361e0eea64867e1703b78fe98851ab9c0ff404fb999e8f69b6d6334202ab1c4"} Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.185119 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"] Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.190242 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.191900 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.234130 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"] Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.291551 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.291639 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.291674 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc 
kubenswrapper[5116]: I0322 00:20:02.393260 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.393333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.393371 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.394121 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.394794 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.417755 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: E0322 00:20:02.459154 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11840734_dc87_4532_b341_aeb889f011c4.slice/crio-conmon-8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8.scope\": RecentStats: unable to find data in memory cache]" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.514658 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.533828 5116 generic.go:358] "Generic (PLEG): container finished" podID="11840734-dc87-4532-b341-aeb889f011c4" containerID="8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8" exitCode=0 Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.534051 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" event={"ID":"11840734-dc87-4532-b341-aeb889f011c4","Type":"ContainerDied","Data":"8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8"} Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.736243 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"] Mar 22 00:20:02 crc kubenswrapper[5116]: W0322 00:20:02.751030 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd47ba40_7b8e_4f2c_8e16_62a5f085def8.slice/crio-87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2 WatchSource:0}: Error finding container 87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2: Status 404 returned error can't find the container with id 87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2 Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.541946 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerID="44d25f50e90508d11aa6151893869dc818c08d23f4ec081d18453e99cc999cb4" exitCode=0 Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.542011 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" 
event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"44d25f50e90508d11aa6151893869dc818c08d23f4ec081d18453e99cc999cb4"} Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.542432 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerStarted","Data":"87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2"} Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.786998 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.915750 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"11840734-dc87-4532-b341-aeb889f011c4\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") " Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.925491 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p" (OuterVolumeSpecName: "kube-api-access-fld7p") pod "11840734-dc87-4532-b341-aeb889f011c4" (UID: "11840734-dc87-4532-b341-aeb889f011c4"). InnerVolumeSpecName "kube-api-access-fld7p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.017383 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.553719 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.553708 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" event={"ID":"11840734-dc87-4532-b341-aeb889f011c4","Type":"ContainerDied","Data":"c361e0eea64867e1703b78fe98851ab9c0ff404fb999e8f69b6d6334202ab1c4"} Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.554319 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c361e0eea64867e1703b78fe98851ab9c0ff404fb999e8f69b6d6334202ab1c4" Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.847794 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"] Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.852050 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"] Mar 22 00:20:05 crc kubenswrapper[5116]: I0322 00:20:05.562064 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerID="ec351302955a52d95c88ee77bc5d5bfac4e35eb69fcb5e061e7f64cb8d6956b7" exitCode=0 Mar 22 00:20:05 crc kubenswrapper[5116]: I0322 00:20:05.562111 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"ec351302955a52d95c88ee77bc5d5bfac4e35eb69fcb5e061e7f64cb8d6956b7"} Mar 22 00:20:05 crc kubenswrapper[5116]: I0322 00:20:05.709073 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" path="/var/lib/kubelet/pods/3b94e50a-fe81-48fe-a23a-c15956c06d21/volumes" Mar 22 00:20:06 crc kubenswrapper[5116]: I0322 00:20:06.573475 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" 
containerID="09c8ce3bf4207242310e44b1cfbd97c2b09a09c2b340457bb558675ee6f5519e" exitCode=0 Mar 22 00:20:06 crc kubenswrapper[5116]: I0322 00:20:06.573605 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"09c8ce3bf4207242310e44b1cfbd97c2b09a09c2b340457bb558675ee6f5519e"} Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.792541 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.876693 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.876807 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.876848 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.880118 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle" (OuterVolumeSpecName: "bundle") pod 
"dd47ba40-7b8e-4f2c-8e16-62a5f085def8" (UID: "dd47ba40-7b8e-4f2c-8e16-62a5f085def8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.885113 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m" (OuterVolumeSpecName: "kube-api-access-whz7m") pod "dd47ba40-7b8e-4f2c-8e16-62a5f085def8" (UID: "dd47ba40-7b8e-4f2c-8e16-62a5f085def8"). InnerVolumeSpecName "kube-api-access-whz7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.888883 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util" (OuterVolumeSpecName: "util") pod "dd47ba40-7b8e-4f2c-8e16-62a5f085def8" (UID: "dd47ba40-7b8e-4f2c-8e16-62a5f085def8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.978585 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.978936 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.978954 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.406301 5116 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"] Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407227 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11840734-dc87-4532-b341-aeb889f011c4" containerName="oc" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407260 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="11840734-dc87-4532-b341-aeb889f011c4" containerName="oc" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407286 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="util" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407296 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="util" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407316 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="extract" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407329 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="extract" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407370 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="pull" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407379 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="pull" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407524 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="11840734-dc87-4532-b341-aeb889f011c4" containerName="oc" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407546 5116 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="extract" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.430199 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"] Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.430551 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.587841 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.587996 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.588062 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc 
kubenswrapper[5116]: I0322 00:20:08.592644 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2"} Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.592906 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.592825 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.689939 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.690288 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.690443 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod 
\"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.691369 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.691475 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.712320 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.754937 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.949674 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"] Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.200751 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"] Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.210266 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.216188 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"] Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.402061 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.402229 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.402278 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.881996 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.882085 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.882110 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.887541 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.888756 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.925396 5116 generic.go:358] "Generic (PLEG): container finished" podID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerID="11a3914182db445f6a9e39e4bc97e7f3dd08c68c9a0b646cac172ef7b9d21b07" exitCode=0 Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.927950 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"11a3914182db445f6a9e39e4bc97e7f3dd08c68c9a0b646cac172ef7b9d21b07"} Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.928009 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerStarted","Data":"611276c02629ef0f5a6f5a4b86e87c592c692b0303f6a16722779b8cf0191900"} Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.950769 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod 
\"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.093920 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.327918 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"] Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.931902 5116 generic.go:358] "Generic (PLEG): container finished" podID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerID="02b484c4931a69a97cd0e50e5653d639392ec80ded43684e04efe8cb32713b66" exitCode=0 Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.932102 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"02b484c4931a69a97cd0e50e5653d639392ec80ded43684e04efe8cb32713b66"} Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.932391 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerStarted","Data":"8f102e5b70ff913d33c6363aceb7b3c4ec4bc9b1fbc6f7dce795f8e50c8f34ca"} Mar 22 00:20:12 crc kubenswrapper[5116]: I0322 00:20:12.941285 5116 generic.go:358] "Generic (PLEG): container finished" podID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerID="e9d4922302c59ed93bcb0147ed832a245de045f022360469768b996c2d4dd48a" exitCode=0 Mar 22 00:20:12 crc kubenswrapper[5116]: I0322 00:20:12.941342 5116 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"e9d4922302c59ed93bcb0147ed832a245de045f022360469768b996c2d4dd48a"} Mar 22 00:20:12 crc kubenswrapper[5116]: I0322 00:20:12.944512 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerStarted","Data":"0472789dcb22aa7f57ab96d89df83c568bffeabedf523fab6d1804f1f1e4b421"} Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.953539 5116 generic.go:358] "Generic (PLEG): container finished" podID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerID="4d5ea94b2e5d05239242984b64e2bd61b5bb1b1b7acfa1ab37e0108951c2af8d" exitCode=0 Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.953620 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"4d5ea94b2e5d05239242984b64e2bd61b5bb1b1b7acfa1ab37e0108951c2af8d"} Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.956761 5116 generic.go:358] "Generic (PLEG): container finished" podID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerID="0472789dcb22aa7f57ab96d89df83c568bffeabedf523fab6d1804f1f1e4b421" exitCode=0 Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.956805 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"0472789dcb22aa7f57ab96d89df83c568bffeabedf523fab6d1804f1f1e4b421"} Mar 22 00:20:14 crc kubenswrapper[5116]: I0322 00:20:14.966654 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerID="2bd61c7843ffe7c87bc9273e0c4e5b0529f922d0bb45e5d57ffe32cbf56fbf16" exitCode=0 Mar 22 00:20:14 crc kubenswrapper[5116]: I0322 00:20:14.966779 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"2bd61c7843ffe7c87bc9273e0c4e5b0529f922d0bb45e5d57ffe32cbf56fbf16"} Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.304380 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.366860 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.367067 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.367113 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.377426 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle" 
(OuterVolumeSpecName: "bundle") pod "c5f79273-4f52-4d9f-ab31-5af0123ff34c" (UID: "c5f79273-4f52-4d9f-ab31-5af0123ff34c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.388540 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn" (OuterVolumeSpecName: "kube-api-access-7xzzn") pod "c5f79273-4f52-4d9f-ab31-5af0123ff34c" (UID: "c5f79273-4f52-4d9f-ab31-5af0123ff34c"). InnerVolumeSpecName "kube-api-access-7xzzn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.394282 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util" (OuterVolumeSpecName: "util") pod "c5f79273-4f52-4d9f-ab31-5af0123ff34c" (UID: "c5f79273-4f52-4d9f-ab31-5af0123ff34c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.469706 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.469761 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.469771 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.975621 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.975644 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"611276c02629ef0f5a6f5a4b86e87c592c692b0303f6a16722779b8cf0191900"} Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.975733 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611276c02629ef0f5a6f5a4b86e87c592c692b0303f6a16722779b8cf0191900" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.333877 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.382240 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"01e2c74b-adcf-45a0-ab9a-e7375676f470\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.382389 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"01e2c74b-adcf-45a0-ab9a-e7375676f470\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.382435 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod \"01e2c74b-adcf-45a0-ab9a-e7375676f470\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " Mar 22 00:20:16 crc 
kubenswrapper[5116]: I0322 00:20:16.383133 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle" (OuterVolumeSpecName: "bundle") pod "01e2c74b-adcf-45a0-ab9a-e7375676f470" (UID: "01e2c74b-adcf-45a0-ab9a-e7375676f470"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.390247 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q" (OuterVolumeSpecName: "kube-api-access-tk68q") pod "01e2c74b-adcf-45a0-ab9a-e7375676f470" (UID: "01e2c74b-adcf-45a0-ab9a-e7375676f470"). InnerVolumeSpecName "kube-api-access-tk68q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.397071 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util" (OuterVolumeSpecName: "util") pod "01e2c74b-adcf-45a0-ab9a-e7375676f470" (UID: "01e2c74b-adcf-45a0-ab9a-e7375676f470"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.483967 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.484009 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.484021 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.983960 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.983956 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"8f102e5b70ff913d33c6363aceb7b3c4ec4bc9b1fbc6f7dce795f8e50c8f34ca"} Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.984096 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f102e5b70ff913d33c6363aceb7b3c4ec4bc9b1fbc6f7dce795f8e50c8f34ca" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.004904 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"] Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005459 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005478 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005492 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005498 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005511 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005518 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005528 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005533 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005564 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005571 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005584 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005592 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005690 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005703 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.013050 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.029508 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.045853 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"] Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.094108 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.094175 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.094262 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.195568 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.195815 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.195860 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.196364 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.196499 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.238454 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.327359 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.844470 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"] Mar 22 00:20:17 crc kubenswrapper[5116]: W0322 00:20:17.862873 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8a1901_425e_4555_a4f0_fd2ae65d7fb8.slice/crio-306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8 WatchSource:0}: Error finding container 306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8: Status 404 returned error can't find the container with id 306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8 Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.992033 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerStarted","Data":"306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8"} Mar 22 00:20:19 crc kubenswrapper[5116]: I0322 00:20:19.001509 5116 generic.go:358] "Generic (PLEG): container finished" podID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerID="7812ba22ca5c8e825adbaf832ba6d4439e56aada267ef4753eebb6d66c29afb3" exitCode=0 Mar 22 00:20:19 crc kubenswrapper[5116]: I0322 00:20:19.002084 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"7812ba22ca5c8e825adbaf832ba6d4439e56aada267ef4753eebb6d66c29afb3"} Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.133644 5116 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-55568fc96c-krbrc"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.141924 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.145797 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.146233 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-dockercfg-8g6jf\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.146486 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.157986 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-55568fc96c-krbrc"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.243382 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqfgm\" (UniqueName: \"kubernetes.io/projected/1a434146-4e47-4733-9f73-955a4c92f2d2-kube-api-access-gqfgm\") pod \"obo-prometheus-operator-55568fc96c-krbrc\" (UID: \"1a434146-4e47-4733-9f73-955a4c92f2d2\") " pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.345136 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqfgm\" (UniqueName: \"kubernetes.io/projected/1a434146-4e47-4733-9f73-955a4c92f2d2-kube-api-access-gqfgm\") pod \"obo-prometheus-operator-55568fc96c-krbrc\" (UID: \"1a434146-4e47-4733-9f73-955a4c92f2d2\") " pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 
00:20:20.380204 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqfgm\" (UniqueName: \"kubernetes.io/projected/1a434146-4e47-4733-9f73-955a4c92f2d2-kube-api-access-gqfgm\") pod \"obo-prometheus-operator-55568fc96c-krbrc\" (UID: \"1a434146-4e47-4733-9f73-955a4c92f2d2\") " pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.459629 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.465744 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.470318 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.472804 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-dockercfg-9cpr6\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.473206 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-service-cert\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.484221 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.494619 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.502995 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.528364 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.548724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.549283 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.650993 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.651096 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.651143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.651204 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.678759 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.683268 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.755859 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.755918 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.764739 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.768580 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.834299 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.845114 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.851597 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-55568fc96c-krbrc"] Mar 22 00:20:20 crc kubenswrapper[5116]: W0322 00:20:20.892025 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a434146_4e47_4733_9f73_955a4c92f2d2.slice/crio-ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398 WatchSource:0}: Error finding container ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398: Status 404 returned error can't find the container with id ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398 Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.027385 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" event={"ID":"1a434146-4e47-4733-9f73-955a4c92f2d2","Type":"ContainerStarted","Data":"ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398"} Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.106223 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-587f9c8867-sxrpm"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.118810 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.122588 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"observability-operator-tls\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.125902 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-587f9c8867-sxrpm"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.132613 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"observability-operator-sa-dockercfg-fsqpl\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.202409 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf"] Mar 22 00:20:21 crc kubenswrapper[5116]: W0322 00:20:21.216953 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0f143c_b305_43e1_937e_020d84101219.slice/crio-e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0 WatchSource:0}: Error finding container e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0: Status 404 returned error can't find the container with id e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0 Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.281502 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcxvb\" (UniqueName: \"kubernetes.io/projected/b998a8ef-dbc2-4004-a589-608b0bf774e7-kube-api-access-mcxvb\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.281741 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b998a8ef-dbc2-4004-a589-608b0bf774e7-observability-operator-tls\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.382976 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcxvb\" (UniqueName: \"kubernetes.io/projected/b998a8ef-dbc2-4004-a589-608b0bf774e7-kube-api-access-mcxvb\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.383503 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b998a8ef-dbc2-4004-a589-608b0bf774e7-observability-operator-tls\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.389714 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b998a8ef-dbc2-4004-a589-608b0bf774e7-observability-operator-tls\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.404505 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcxvb\" (UniqueName: \"kubernetes.io/projected/b998a8ef-dbc2-4004-a589-608b0bf774e7-kube-api-access-mcxvb\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: 
\"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.462649 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.479292 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bff5dbc55-tpg7b"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.496432 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.500618 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"perses-operator-dockercfg-nftvh\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.502932 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"perses-operator-service-cert\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.502966 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bff5dbc55-tpg7b"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.540014 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.541793 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586346 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-webhook-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: 
\"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586386 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6k9t\" (UniqueName: \"kubernetes.io/projected/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-kube-api-access-m6k9t\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586427 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-apiservice-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-openshift-service-ca\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.624755 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691186 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-webhook-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 
00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6k9t\" (UniqueName: \"kubernetes.io/projected/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-kube-api-access-m6k9t\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691274 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-apiservice-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691332 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-openshift-service-ca\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.693239 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-openshift-service-ca\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.701629 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-apiservice-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " 
pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.709018 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-webhook-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.753804 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6k9t\" (UniqueName: \"kubernetes.io/projected/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-kube-api-access-m6k9t\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.840478 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.998573 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-587f9c8867-sxrpm"] Mar 22 00:20:22 crc kubenswrapper[5116]: I0322 00:20:22.051192 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" event={"ID":"2d0f143c-b305-43e1-937e-020d84101219","Type":"ContainerStarted","Data":"e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0"} Mar 22 00:20:22 crc kubenswrapper[5116]: I0322 00:20:22.062304 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" event={"ID":"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f","Type":"ContainerStarted","Data":"46d0188e1e2cdb419883fb6b789d161cdd04754e91f713391cfb9e073d58d375"} Mar 22 00:20:22 crc kubenswrapper[5116]: 
I0322 00:20:22.534156 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bff5dbc55-tpg7b"] Mar 22 00:20:23 crc kubenswrapper[5116]: I0322 00:20:23.076123 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" event={"ID":"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8","Type":"ContainerStarted","Data":"729be4ebb6d35cfc2cdd555bbb3c39b505dbc03a2911d424cd5d0a142b7d3a72"} Mar 22 00:20:23 crc kubenswrapper[5116]: I0322 00:20:23.078249 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" event={"ID":"b998a8ef-dbc2-4004-a589-608b0bf774e7","Type":"ContainerStarted","Data":"86654e9bd3a4bdcad31031b04ded59f594b12c4984c95eb018db3d270a78c385"} Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.812385 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-58c4bc569-nwp4h"] Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.907359 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-58c4bc569-nwp4h"] Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.907527 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.910794 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"openshift-service-ca.crt\"" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.911074 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elastic-operator-service-cert\"" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.911067 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elastic-operator-dockercfg-jvfh8\"" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.912221 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"kube-root-ca.crt\"" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.066625 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-webhook-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.066770 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-apiservice-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.066820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dxj\" (UniqueName: \"kubernetes.io/projected/d6bec193-8107-440f-89aa-944885708496-kube-api-access-27dxj\") pod 
\"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.167930 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27dxj\" (UniqueName: \"kubernetes.io/projected/d6bec193-8107-440f-89aa-944885708496-kube-api-access-27dxj\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.168004 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-webhook-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.168098 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-apiservice-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.177819 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-webhook-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.179718 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-apiservice-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.198890 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dxj\" (UniqueName: \"kubernetes.io/projected/d6bec193-8107-440f-89aa-944885708496-kube-api-access-27dxj\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.243854 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.569415 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-78b9bd8798-xdlcm"] Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.589747 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-78b9bd8798-xdlcm"] Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.589829 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.601036 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"interconnect-operator-dockercfg-dfbpf\"" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.701608 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn98v\" (UniqueName: \"kubernetes.io/projected/73441892-3e06-43b0-bb99-44e4ff5f74b9-kube-api-access-qn98v\") pod \"interconnect-operator-78b9bd8798-xdlcm\" (UID: \"73441892-3e06-43b0-bb99-44e4ff5f74b9\") " pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.803490 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn98v\" (UniqueName: \"kubernetes.io/projected/73441892-3e06-43b0-bb99-44e4ff5f74b9-kube-api-access-qn98v\") pod \"interconnect-operator-78b9bd8798-xdlcm\" (UID: \"73441892-3e06-43b0-bb99-44e4ff5f74b9\") " pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.854791 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn98v\" (UniqueName: \"kubernetes.io/projected/73441892-3e06-43b0-bb99-44e4ff5f74b9-kube-api-access-qn98v\") pod \"interconnect-operator-78b9bd8798-xdlcm\" (UID: \"73441892-3e06-43b0-bb99-44e4ff5f74b9\") " pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.937997 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:38 crc kubenswrapper[5116]: I0322 00:20:38.416247 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-58c4bc569-nwp4h"] Mar 22 00:20:38 crc kubenswrapper[5116]: W0322 00:20:38.447696 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bec193_8107_440f_89aa_944885708496.slice/crio-c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6 WatchSource:0}: Error finding container c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6: Status 404 returned error can't find the container with id c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6 Mar 22 00:20:38 crc kubenswrapper[5116]: I0322 00:20:38.474140 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-78b9bd8798-xdlcm"] Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.291506 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" event={"ID":"b998a8ef-dbc2-4004-a589-608b0bf774e7","Type":"ContainerStarted","Data":"54ec4333cb3d2e1db09c103bcc0e2e3c3ecf291a9bed2554865581fdce46c1de"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.292644 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.297027 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" event={"ID":"73441892-3e06-43b0-bb99-44e4ff5f74b9","Type":"ContainerStarted","Data":"57c732f90f4bf5ab948c0d7af0f13f077b6b0d7135d178131004699f5b9d820f"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.303562 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerID="72a12537d821669d307e258cccd0d709b04bdded6776d3e5e7c34f570cfd88f0" exitCode=0 Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.303668 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"72a12537d821669d307e258cccd0d709b04bdded6776d3e5e7c34f570cfd88f0"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.305003 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.314370 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" event={"ID":"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f","Type":"ContainerStarted","Data":"8508c3340f68831ac8d738c014b16a782b7867bcb0454c1208a5ab81440b17b8"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.324674 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" event={"ID":"d6bec193-8107-440f-89aa-944885708496","Type":"ContainerStarted","Data":"c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.330997 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" event={"ID":"2d0f143c-b305-43e1-937e-020d84101219","Type":"ContainerStarted","Data":"507d4b77ff13cc5afb466cae703417a9f31f021c083dde7a6f803f295bfddbdb"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.338519 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" podStartSLOduration=2.3046954680000002 
podStartE2EDuration="18.338487605s" podCreationTimestamp="2026-03-22 00:20:21 +0000 UTC" firstStartedPulling="2026-03-22 00:20:22.065066903 +0000 UTC m=+693.087368276" lastFinishedPulling="2026-03-22 00:20:38.09885904 +0000 UTC m=+709.121160413" observedRunningTime="2026-03-22 00:20:39.32963265 +0000 UTC m=+710.351934033" watchObservedRunningTime="2026-03-22 00:20:39.338487605 +0000 UTC m=+710.360788978" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.341230 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" event={"ID":"1a434146-4e47-4733-9f73-955a4c92f2d2","Type":"ContainerStarted","Data":"1a2f57f8fa48bde4099bcfe0277665e66c74a352b0b99625ed585cb64c83d9ac"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.343774 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" event={"ID":"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8","Type":"ContainerStarted","Data":"6fa7a96ea3f27e4ce7da6bfa8c56107e03adea13d834bc0f2be761d178afe464"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.344079 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.444871 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" podStartSLOduration=2.915826994 podStartE2EDuration="19.444840463s" podCreationTimestamp="2026-03-22 00:20:20 +0000 UTC" firstStartedPulling="2026-03-22 00:20:21.568450576 +0000 UTC m=+692.590751949" lastFinishedPulling="2026-03-22 00:20:38.097464045 +0000 UTC m=+709.119765418" observedRunningTime="2026-03-22 00:20:39.352392463 +0000 UTC m=+710.374693866" watchObservedRunningTime="2026-03-22 00:20:39.444840463 +0000 UTC m=+710.467141836" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.520784 
5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" podStartSLOduration=2.982118432 podStartE2EDuration="18.52076251s" podCreationTimestamp="2026-03-22 00:20:21 +0000 UTC" firstStartedPulling="2026-03-22 00:20:22.581940392 +0000 UTC m=+693.604241775" lastFinishedPulling="2026-03-22 00:20:38.12058448 +0000 UTC m=+709.142885853" observedRunningTime="2026-03-22 00:20:39.510806439 +0000 UTC m=+710.533107832" watchObservedRunningTime="2026-03-22 00:20:39.52076251 +0000 UTC m=+710.543063883" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.554886 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" podStartSLOduration=2.351570606 podStartE2EDuration="19.554861179s" podCreationTimestamp="2026-03-22 00:20:20 +0000 UTC" firstStartedPulling="2026-03-22 00:20:20.895562327 +0000 UTC m=+691.917863700" lastFinishedPulling="2026-03-22 00:20:38.0988529 +0000 UTC m=+709.121154273" observedRunningTime="2026-03-22 00:20:39.537333075 +0000 UTC m=+710.559634448" watchObservedRunningTime="2026-03-22 00:20:39.554861179 +0000 UTC m=+710.577162552" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.591119 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" podStartSLOduration=2.692242458 podStartE2EDuration="19.590928872s" podCreationTimestamp="2026-03-22 00:20:20 +0000 UTC" firstStartedPulling="2026-03-22 00:20:21.221889476 +0000 UTC m=+692.244190849" lastFinishedPulling="2026-03-22 00:20:38.12057587 +0000 UTC m=+709.142877263" observedRunningTime="2026-03-22 00:20:39.58156837 +0000 UTC m=+710.603869743" watchObservedRunningTime="2026-03-22 00:20:39.590928872 +0000 UTC m=+710.613230245" Mar 22 00:20:40 crc kubenswrapper[5116]: I0322 00:20:40.358437 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerID="036133fcef38482e0fc86cfb1126c8a7e10691ba008834fd95bf6087c849dbbf" exitCode=0
Mar 22 00:20:40 crc kubenswrapper[5116]: I0322 00:20:40.359333 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"036133fcef38482e0fc86cfb1126c8a7e10691ba008834fd95bf6087c849dbbf"}
Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.763687 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"
Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.911378 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") "
Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.911635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") "
Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.911717 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") "
Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.914029 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle" (OuterVolumeSpecName: "bundle") pod "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" (UID: "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.930355 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util" (OuterVolumeSpecName: "util") pod "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" (UID: "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.936441 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj" (OuterVolumeSpecName: "kube-api-access-xtlkj") pod "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" (UID: "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8"). InnerVolumeSpecName "kube-api-access-xtlkj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.014784 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.014850 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.014866 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.401709 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.401751 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8"}
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.401781 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8"
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.404822 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" event={"ID":"d6bec193-8107-440f-89aa-944885708496","Type":"ContainerStarted","Data":"0636ada234767a36a9cf7f3c546857611e34e9c4bd551b77148da4ff25e24b74"}
Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.428817 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" podStartSLOduration=15.099320026000001 podStartE2EDuration="20.428786694s" podCreationTimestamp="2026-03-22 00:20:24 +0000 UTC" firstStartedPulling="2026-03-22 00:20:38.457681085 +0000 UTC m=+709.479982458" lastFinishedPulling="2026-03-22 00:20:43.787147753 +0000 UTC m=+714.809449126" observedRunningTime="2026-03-22 00:20:44.425882761 +0000 UTC m=+715.448184134" watchObservedRunningTime="2026-03-22 00:20:44.428786694 +0000 UTC m=+715.451088067"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.225921 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227056 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="extract"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227077 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="extract"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227101 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="util"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227108 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="util"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227147 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="pull"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227157 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="pull"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227291 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="extract"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.368237 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.368415 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.371594 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-remote-ca\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.373848 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-default-es-config\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.374043 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-default-es-transport-certs\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.374660 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-dockercfg-crp6c\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.374797 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"elasticsearch-es-scripts\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.375038 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-http-certs-internal\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.375183 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"elasticsearch-es-unicast-hosts\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.375321 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-xpack-file-realm\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.378367 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-internal-users\""
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542438 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542550 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542707 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542922 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542971 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543086 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543154 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543338 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543396 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543478 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543587 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543613 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543635 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543683 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543710 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644510 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644567 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644607 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644633 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644708 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644735 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644753 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645261 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645354 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645410 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645663 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645795 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645849 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646071 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646088 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646212 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646120 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646272 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646309 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646469 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646489 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.647951 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.653204 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657216 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657271 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657904 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657924 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.659462 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.663352 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.695855 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Mar 22 00:20:46 crc kubenswrapper[5116]: I0322 00:20:46.676725 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" containerID="cri-o://c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c" gracePeriod=30
Mar 22 00:20:47 crc kubenswrapper[5116]: I0322 00:20:47.441192 5116 generic.go:358] "Generic (PLEG): container finished" podID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerID="c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c" exitCode=0
Mar 22 00:20:47 crc kubenswrapper[5116]: I0322 00:20:47.441299 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerDied","Data":"c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c"}
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.107131 5116 patch_prober.go:28] interesting pod/image-registry-66587d64c8-zwkhp container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.16:5000/healthz\": dial tcp 10.217.0.16:5000: connect: connection refused" start-of-body=
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.107503 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.16:5000/healthz\": dial tcp 10.217.0.16:5000: connect: connection refused"
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.139478 5116 scope.go:117] "RemoveContainer" containerID="2acebccbc85d9eff1c121aca735947ed6d77f0c1bc6b89aca01a5fc1d6de9f77"
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.374196 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b"
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.444570 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.478792 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerDied","Data":"bca4de0caba9859b14c3f0eb17a3776e71425e24f9833420070510404cc3406c"}
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.478869 5116 scope.go:117] "RemoveContainer" containerID="c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c"
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.479091 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.530755 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531189 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531559 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531729 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531855 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.532090 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.532142 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.532208 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") "
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.532803 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.533036 5116 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.537744 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.544698 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.547498 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.549233 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: "registry-storage") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". PluginName "kubernetes.io/csi", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.552638 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.554926 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.563211 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn" (OuterVolumeSpecName: "kube-api-access-hl2cn") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "kube-api-access-hl2cn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.613356 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633805 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633833 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633864 5116 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633877 5116 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633891 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633900 5116 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.811735 5116 kubelet.go:2553] "SyncLoop DELETE" source="api"
pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.816802 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.496161 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerStarted","Data":"b5fa4ee0dfec28e69116324034f41cf34a2fef085e700e1eaca593926850691b"} Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.504109 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" event={"ID":"73441892-3e06-43b0-bb99-44e4ff5f74b9","Type":"ContainerStarted","Data":"6fcfee34878c0e0e032a6302107a8d64e0027a2ff9e4729fdce9f93c8ade0b61"} Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.518702 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" podStartSLOduration=13.673626277 podStartE2EDuration="25.518684564s" podCreationTimestamp="2026-03-22 00:20:26 +0000 UTC" firstStartedPulling="2026-03-22 00:20:38.48575944 +0000 UTC m=+709.508060813" lastFinishedPulling="2026-03-22 00:20:50.330817717 +0000 UTC m=+721.353119100" observedRunningTime="2026-03-22 00:20:51.517552107 +0000 UTC m=+722.539853480" watchObservedRunningTime="2026-03-22 00:20:51.518684564 +0000 UTC m=+722.540985937" Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.709874 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" path="/var/lib/kubelet/pods/36ff6a0d-ec37-48dd-9e2b-01bcb5755738/volumes" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.687476 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l"] Mar 
22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.689643 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.689661 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.689801 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.751994 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l"] Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.752105 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.756958 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.757310 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-z8t8s\"" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.757497 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.835183 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjg4k\" (UniqueName: \"kubernetes.io/projected/8aad28fd-d043-497c-bd68-7d3515fd76cf-kube-api-access-bjg4k\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: 
\"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.835356 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8aad28fd-d043-497c-bd68-7d3515fd76cf-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.936555 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjg4k\" (UniqueName: \"kubernetes.io/projected/8aad28fd-d043-497c-bd68-7d3515fd76cf-kube-api-access-bjg4k\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.936684 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8aad28fd-d043-497c-bd68-7d3515fd76cf-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.937333 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8aad28fd-d043-497c-bd68-7d3515fd76cf-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.972314 5116 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjg4k\" (UniqueName: \"kubernetes.io/projected/8aad28fd-d043-497c-bd68-7d3515fd76cf-kube-api-access-bjg4k\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:57 crc kubenswrapper[5116]: I0322 00:20:57.069590 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:59 crc kubenswrapper[5116]: I0322 00:20:59.306193 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l"] Mar 22 00:20:59 crc kubenswrapper[5116]: W0322 00:20:59.322142 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aad28fd_d043_497c_bd68_7d3515fd76cf.slice/crio-670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296 WatchSource:0}: Error finding container 670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296: Status 404 returned error can't find the container with id 670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296 Mar 22 00:20:59 crc kubenswrapper[5116]: I0322 00:20:59.578097 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" event={"ID":"8aad28fd-d043-497c-bd68-7d3515fd76cf","Type":"ContainerStarted","Data":"670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296"} Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.698494 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" 
event={"ID":"8aad28fd-d043-497c-bd68-7d3515fd76cf","Type":"ContainerStarted","Data":"b7e243cfb52fe05a829b147c433706c9d55a2f73de1fe99f38444f54f4b18ed0"} Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.701060 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerStarted","Data":"6d24febd3fe6a1d1435e03e9cf028a5983f2b9b3281aa427ae12f63a2758bb7b"} Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.724237 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" podStartSLOduration=3.845346365 podStartE2EDuration="18.724212517s" podCreationTimestamp="2026-03-22 00:20:56 +0000 UTC" firstStartedPulling="2026-03-22 00:20:59.326131261 +0000 UTC m=+730.348432634" lastFinishedPulling="2026-03-22 00:21:14.204997413 +0000 UTC m=+745.227298786" observedRunningTime="2026-03-22 00:21:14.721095437 +0000 UTC m=+745.743396840" watchObservedRunningTime="2026-03-22 00:21:14.724212517 +0000 UTC m=+745.746513890" Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.931831 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.968697 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 22 00:21:16 crc kubenswrapper[5116]: I0322 00:21:16.714937 5116 generic.go:358] "Generic (PLEG): container finished" podID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerID="6d24febd3fe6a1d1435e03e9cf028a5983f2b9b3281aa427ae12f63a2758bb7b" exitCode=0 Mar 22 00:21:16 crc kubenswrapper[5116]: I0322 00:21:16.715421 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" 
event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerDied","Data":"6d24febd3fe6a1d1435e03e9cf028a5983f2b9b3281aa427ae12f63a2758bb7b"} Mar 22 00:21:17 crc kubenswrapper[5116]: I0322 00:21:17.723682 5116 generic.go:358] "Generic (PLEG): container finished" podID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerID="3e1c524b1c5bc74c3c638ea86eeead40ea3a9c7c3a14e091a809e129a835dca8" exitCode=0 Mar 22 00:21:17 crc kubenswrapper[5116]: I0322 00:21:17.723796 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerDied","Data":"3e1c524b1c5bc74c3c638ea86eeead40ea3a9c7c3a14e091a809e129a835dca8"} Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.353953 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8tskc"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.358980 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.361809 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.361829 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-n648w\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.362480 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.374505 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8tskc"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.485486 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.485820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46wr7\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-kube-api-access-46wr7\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.587511 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-bound-sa-token\") pod 
\"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.587616 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46wr7\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-kube-api-access-46wr7\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.609940 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.611369 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46wr7\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-kube-api-access-46wr7\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.625914 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.631955 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.634160 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-1-sys-config\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.634522 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-1-global-ca\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.634732 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-1-ca\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.635935 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.645351 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.676429 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.737372 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerStarted","Data":"be87eb9395ad9052d5f304afc3379527d4242f64eab5335d22007c30a2e0f4c7"} Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.737527 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.772267 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=10.061333539 podStartE2EDuration="33.772238912s" podCreationTimestamp="2026-03-22 00:20:45 +0000 UTC" firstStartedPulling="2026-03-22 00:20:50.630089333 +0000 UTC m=+721.652390706" lastFinishedPulling="2026-03-22 00:21:14.340994706 +0000 UTC m=+745.363296079" observedRunningTime="2026-03-22 00:21:18.766866929 +0000 UTC m=+749.789168302" watchObservedRunningTime="2026-03-22 00:21:18.772238912 +0000 UTC m=+749.794540295" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789881 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789940 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: 
\"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789967 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789997 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790030 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790048 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790066 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790092 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790130 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790185 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790220 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: 
\"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790248 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.892344 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.892920 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.892978 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893036 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893070 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893089 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893148 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893204 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893230 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893263 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893327 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893351 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893453 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893495 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.894396 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.894410 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.894824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.895206 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.895961 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.896378 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.896483 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.904281 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.905116 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.917896 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.962947 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.971409 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8tskc"]
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.400818 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 22 00:21:19 crc kubenswrapper[5116]: W0322 00:21:19.410545 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64d31296_d621_4ea6_9145_d61b41e2f2f8.slice/crio-1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8 WatchSource:0}: Error finding container 1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8: Status 404 returned error can't find the container with id 1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.547913 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"]
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.553080 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.556079 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-h5hnt\""
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.557912 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"]
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.704030 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.704598 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxdb\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-kube-api-access-kqxdb\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.760507 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerStarted","Data":"1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8"}
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.763238 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" event={"ID":"57a02b12-f3d8-4264-af22-4fb7bc40602f","Type":"ContainerStarted","Data":"b34638f3e93a0bacb5347bb40ec5c785a71a63c33097eebac668cd34743cbfd8"}
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.807133 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.807923 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxdb\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-kube-api-access-kqxdb\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.841808 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.842564 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxdb\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-kube-api-access-kqxdb\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.871546 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"
Mar 22 00:21:20 crc kubenswrapper[5116]: I0322 00:21:20.104453 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"]
Mar 22 00:21:20 crc kubenswrapper[5116]: I0322 00:21:20.779274 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" event={"ID":"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69","Type":"ContainerStarted","Data":"06d769a2727f985e1eb4f486ba4dbd4f8b0b1367735c8fcef2d93b288e543534"}
Mar 22 00:21:23 crc kubenswrapper[5116]: I0322 00:21:23.057431 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:21:23 crc kubenswrapper[5116]: I0322 00:21:23.058027 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.650188 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.858009 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" event={"ID":"57a02b12-f3d8-4264-af22-4fb7bc40602f","Type":"ContainerStarted","Data":"aceab5083cda3271b1f244d2f653dc7cb9e730004ed36a2c861faf10dd96361b"}
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.858153 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc"
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.860970 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" event={"ID":"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69","Type":"ContainerStarted","Data":"63e33c9dcfd4aeaabf58f24891e433e39748f03efbaa7b9bada0d7cde549e657"}
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.863096 5116 generic.go:358] "Generic (PLEG): container finished" podID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerID="f3b30650b7f17897ba5577d414b6e7254548d88f736d4402f1743debfd49c680" exitCode=0
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.863220 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerDied","Data":"f3b30650b7f17897ba5577d414b6e7254548d88f736d4402f1743debfd49c680"}
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.885857 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" podStartSLOduration=2.220774982 podStartE2EDuration="10.885826953s" podCreationTimestamp="2026-03-22 00:21:18 +0000 UTC" firstStartedPulling="2026-03-22 00:21:19.022878791 +0000 UTC m=+750.045180164" lastFinishedPulling="2026-03-22 00:21:27.687930762 +0000 UTC m=+758.710232135" observedRunningTime="2026-03-22 00:21:28.879253191 +0000 UTC m=+759.901554624" watchObservedRunningTime="2026-03-22 00:21:28.885826953 +0000 UTC m=+759.908128366"
Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.941126 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" podStartSLOduration=2.385827493 podStartE2EDuration="9.941095034s" podCreationTimestamp="2026-03-22 00:21:19 +0000 UTC" firstStartedPulling="2026-03-22 00:21:20.126392919 +0000 UTC m=+751.148694292" lastFinishedPulling="2026-03-22 00:21:27.68166046 +0000 UTC m=+758.703961833" observedRunningTime="2026-03-22 00:21:28.936766754 +0000 UTC m=+759.959068137" watchObservedRunningTime="2026-03-22 00:21:28.941095034 +0000 UTC m=+759.963396407"
Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.840814 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerName="elasticsearch" probeResult="failure" output=<
Mar 22 00:21:29 crc kubenswrapper[5116]: {"timestamp": "2026-03-22T00:21:29+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Mar 22 00:21:29 crc kubenswrapper[5116]: >
Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.873924 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerStarted","Data":"0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273"}
Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.874097 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" containerID="cri-o://0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273" gracePeriod=30
Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.907509 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=3.625996164 podStartE2EDuration="11.907479542s" podCreationTimestamp="2026-03-22 00:21:18 +0000 UTC" firstStartedPulling="2026-03-22 00:21:19.41436989 +0000 UTC m=+750.436671263" lastFinishedPulling="2026-03-22 00:21:27.695853268 +0000 UTC m=+758.718154641" observedRunningTime="2026-03-22 00:21:29.896955312 +0000 UTC m=+760.919256685" watchObservedRunningTime="2026-03-22 00:21:29.907479542 +0000 UTC m=+760.929780915"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.314997 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.634644 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.634823 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.637554 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-2-global-ca\""
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.637676 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-2-sys-config\""
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.640079 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-2-ca\""
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682246 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682328 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682408 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682452 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682480 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682507 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682523 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682555 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682577 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682632 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682659 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682697 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784184 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784638 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784910 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785033 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785153 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785344 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785587 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785708 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785818 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785918 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.787447 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.787885 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788146 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788362 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788479 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788593 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788742 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788897 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788936 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.796979 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.800264 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.807478 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.953801 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Mar 22 00:21:34 crc kubenswrapper[5116]: I0322 00:21:34.792503 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Mar 22 00:21:34 crc kubenswrapper[5116]: I0322 00:21:34.836377 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerName="elasticsearch" probeResult="failure" output=<
Mar 22 00:21:34 crc kubenswrapper[5116]: {"timestamp": "2026-03-22T00:21:34+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Mar 22 00:21:34 crc kubenswrapper[5116]: >
Mar 22 00:21:34 crc kubenswrapper[5116]: I0322 00:21:34.878201 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.533542 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-nt75l"]
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.707481 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nt75l"]
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.707604 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.720594 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-85xrz\""
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.796778 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-bound-sa-token\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.797027 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcdf\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-kube-api-access-gxcdf\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.901711 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-bound-sa-token\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.901785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcdf\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-kube-api-access-gxcdf\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.923350 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcdf\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-kube-api-access-gxcdf\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.923614 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-bound-sa-token\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.996701 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_64d31296-d621-4ea6-9145-d61b41e2f2f8/docker-build/0.log"
Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.997217 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.025909 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nt75l"
Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.105952 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") "
Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.106374 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") "
Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.106533 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") "
Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107366 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "container-storage-run".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107407 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107479 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107596 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107651 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107696 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107791 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107870 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107925 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107966 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107970 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108309 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108361 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108456 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108496 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108792 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108814 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108830 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108842 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108854 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108867 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108879 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108891 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.109180 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.149971 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz" (OuterVolumeSpecName: "kube-api-access-hcqmz") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "kube-api-access-hcqmz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.150147 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.155362 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211077 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211137 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211157 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211189 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211201 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.483094 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nt75l"] Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.643004 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_64d31296-d621-4ea6-9145-d61b41e2f2f8/docker-build/0.log" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.644239 5116 generic.go:358] "Generic (PLEG): container 
finished" podID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerID="0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273" exitCode=1 Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.644785 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerDied","Data":"0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273"} Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.644845 5116 scope.go:117] "RemoveContainer" containerID="0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.663317 5116 scope.go:117] "RemoveContainer" containerID="f3b30650b7f17897ba5577d414b6e7254548d88f736d4402f1743debfd49c680" Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.653710 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.653704 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerDied","Data":"1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.655705 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nt75l" event={"ID":"e6d082fa-fedb-4089-87be-6bd1f0922f14","Type":"ContainerStarted","Data":"a2af18dfca18a5f1b09b814bf89f0366b28a504f7958cd3c8152c3bbb1548baf"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.655751 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nt75l" event={"ID":"e6d082fa-fedb-4089-87be-6bd1f0922f14","Type":"ContainerStarted","Data":"260824329a312836d7a1180ec769c75537495b4fe0141a96f98865436688e964"} 
Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.657360 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerStarted","Data":"fdd7681f7def8181a3d18d083702a111b549b2b54f9a660df6b4b6ed1754f9b6"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.657430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerStarted","Data":"4bff84bbfff058bf2ecd5285af8c9b30ed4caa49bd5b3b83f0e15af519300215"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.682844 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-nt75l" podStartSLOduration=2.682817826 podStartE2EDuration="2.682817826s" podCreationTimestamp="2026-03-22 00:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:21:38.680025356 +0000 UTC m=+769.702326769" watchObservedRunningTime="2026-03-22 00:21:38.682817826 +0000 UTC m=+769.705119209" Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.706456 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.717899 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:39 crc kubenswrapper[5116]: I0322 00:21:39.707201 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" path="/var/lib/kubelet/pods/64d31296-d621-4ea6-9145-d61b41e2f2f8/volumes" Mar 22 00:21:40 crc kubenswrapper[5116]: I0322 00:21:40.555198 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:21:47 crc kubenswrapper[5116]: I0322 00:21:47.740022 5116 generic.go:358] "Generic (PLEG): container finished" podID="b5b6453f-08b2-400a-8db6-b042778914e1" containerID="fdd7681f7def8181a3d18d083702a111b549b2b54f9a660df6b4b6ed1754f9b6" exitCode=0 Mar 22 00:21:47 crc kubenswrapper[5116]: I0322 00:21:47.740202 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"fdd7681f7def8181a3d18d083702a111b549b2b54f9a660df6b4b6ed1754f9b6"} Mar 22 00:21:48 crc kubenswrapper[5116]: I0322 00:21:48.753406 5116 generic.go:358] "Generic (PLEG): container finished" podID="b5b6453f-08b2-400a-8db6-b042778914e1" containerID="aebfeeceb4efdd1c94ed085dd3b8ce981a1f74150b4b0a287a7cc662eb1c1989" exitCode=0 Mar 22 00:21:48 crc kubenswrapper[5116]: I0322 00:21:48.753583 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"aebfeeceb4efdd1c94ed085dd3b8ce981a1f74150b4b0a287a7cc662eb1c1989"} Mar 22 00:21:48 crc kubenswrapper[5116]: I0322 00:21:48.796299 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_b5b6453f-08b2-400a-8db6-b042778914e1/manage-dockerfile/0.log" Mar 22 00:21:49 crc kubenswrapper[5116]: I0322 00:21:49.769593 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerStarted","Data":"9f80dd160f49863da97b4679eccf90a67796405074ac221d7cb0f7d3175f7f14"} Mar 22 00:21:49 crc kubenswrapper[5116]: I0322 00:21:49.802026 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" 
podStartSLOduration=19.802006586 podStartE2EDuration="19.802006586s" podCreationTimestamp="2026-03-22 00:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:21:49.79529372 +0000 UTC m=+780.817595103" watchObservedRunningTime="2026-03-22 00:21:49.802006586 +0000 UTC m=+780.824307979" Mar 22 00:21:53 crc kubenswrapper[5116]: I0322 00:21:53.056948 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:21:53 crc kubenswrapper[5116]: I0322 00:21:53.057088 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.149343 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150614 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150629 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150661 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="manage-dockerfile" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150671 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="manage-dockerfile" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150800 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.323495 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.323565 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.326076 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.326637 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.327324 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.471456 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod \"auto-csr-approver-29568982-7k9wz\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.572523 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod 
\"auto-csr-approver-29568982-7k9wz\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.594667 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod \"auto-csr-approver-29568982-7k9wz\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.642682 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: W0322 00:22:00.905659 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26bdb492_c2c3_48e7_b86a_b83cb2f4aea5.slice/crio-a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6 WatchSource:0}: Error finding container a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6: Status 404 returned error can't find the container with id a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6 Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.905820 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:22:01 crc kubenswrapper[5116]: I0322 00:22:01.878766 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" event={"ID":"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5","Type":"ContainerStarted","Data":"a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6"} Mar 22 00:22:02 crc kubenswrapper[5116]: I0322 00:22:02.887811 5116 generic.go:358] "Generic (PLEG): container finished" podID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" 
containerID="2ddc5261a5a71c38bc0367d296e6678e1f3648ba5a05ce953a8407e9e3ce8a74" exitCode=0 Mar 22 00:22:02 crc kubenswrapper[5116]: I0322 00:22:02.887885 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" event={"ID":"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5","Type":"ContainerDied","Data":"2ddc5261a5a71c38bc0367d296e6678e1f3648ba5a05ce953a8407e9e3ce8a74"} Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.160931 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.228392 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.235908 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq" (OuterVolumeSpecName: "kube-api-access-zwpfq") pod "26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" (UID: "26bdb492-c2c3-48e7-b86a-b83cb2f4aea5"). InnerVolumeSpecName "kube-api-access-zwpfq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.329642 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") on node \"crc\" DevicePath \"\"" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.904334 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" event={"ID":"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5","Type":"ContainerDied","Data":"a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6"} Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.904386 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.905073 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:05 crc kubenswrapper[5116]: I0322 00:22:05.239994 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"] Mar 22 00:22:05 crc kubenswrapper[5116]: I0322 00:22:05.245963 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"] Mar 22 00:22:05 crc kubenswrapper[5116]: I0322 00:22:05.706479 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" path="/var/lib/kubelet/pods/832c911e-4692-4912-8df4-880e98e4c2c1/volumes" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.057525 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.058235 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.058325 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.059391 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.059516 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2" gracePeriod=600 Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.194423 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2" exitCode=0 Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.194515 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" 
event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2"} Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.195458 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029"} Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.195486 5116 scope.go:117] "RemoveContainer" containerID="29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e" Mar 22 00:22:50 crc kubenswrapper[5116]: I0322 00:22:50.420772 5116 scope.go:117] "RemoveContainer" containerID="9cfe6ad0080f9bd011bb482561dcac74a7fc0e16adff6a8d4fce7c2e783aaf6b" Mar 22 00:23:14 crc kubenswrapper[5116]: I0322 00:23:14.565970 5116 generic.go:358] "Generic (PLEG): container finished" podID="b5b6453f-08b2-400a-8db6-b042778914e1" containerID="9f80dd160f49863da97b4679eccf90a67796405074ac221d7cb0f7d3175f7f14" exitCode=0 Mar 22 00:23:14 crc kubenswrapper[5116]: I0322 00:23:14.566034 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"9f80dd160f49863da97b4679eccf90a67796405074ac221d7cb0f7d3175f7f14"} Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.833502 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.962902 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.962969 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.962991 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963011 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963028 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 
00:23:15.963069 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963101 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963144 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963159 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963218 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963215 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") 
pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963313 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963331 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963529 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963994 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.964496 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). 
InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.964893 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.964939 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.965419 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.968988 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g" (OuterVolumeSpecName: "kube-api-access-c724g") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "kube-api-access-c724g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.969343 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.971205 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.997683 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064534 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064581 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064600 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064616 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064627 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064640 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064652 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064664 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064676 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.124081 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.165738 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.584555 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"4bff84bbfff058bf2ecd5285af8c9b30ed4caa49bd5b3b83f0e15af519300215"} Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.584978 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bff84bbfff058bf2ecd5285af8c9b30ed4caa49bd5b3b83f0e15af519300215" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.584586 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:23:17 crc kubenswrapper[5116]: I0322 00:23:17.968635 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:17 crc kubenswrapper[5116]: I0322 00:23:17.995585 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.565610 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566700 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="git-clone" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566716 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="git-clone" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566734 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="docker-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566739 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="docker-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566752 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" containerName="oc" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566758 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" containerName="oc" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566770 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="manage-dockerfile" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566775 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="manage-dockerfile" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566882 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="docker-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566893 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" containerName="oc" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.841275 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.841609 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.844046 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-1-global-ca\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.844305 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-1-sys-config\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.844615 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.845815 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-1-ca\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939802 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod 
\"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939866 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939896 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939914 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939978 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940028 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940130 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940269 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940308 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940380 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940433 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041404 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041468 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041509 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod 
\"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041531 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041553 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041586 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041614 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041626 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041648 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041881 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042263 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042428 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042747 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042619 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042801 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042784 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.043484 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.043519 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.043542 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.049713 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.049749 5116 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.061970 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.165310 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.585077 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.623832 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerStarted","Data":"217c472cfa6d0d26fe99bb6287f96ab85531e90f295a41a345b695b66937057a"} Mar 22 00:23:22 crc kubenswrapper[5116]: I0322 00:23:22.632388 5116 generic.go:358] "Generic (PLEG): container finished" podID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerID="4fc1945949f4a48bc881b27cfbe1bdcda0b53a37ec4eb1a6d836aed83950ab03" exitCode=0 Mar 22 00:23:22 crc kubenswrapper[5116]: I0322 00:23:22.632487 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" 
event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerDied","Data":"4fc1945949f4a48bc881b27cfbe1bdcda0b53a37ec4eb1a6d836aed83950ab03"} Mar 22 00:23:23 crc kubenswrapper[5116]: I0322 00:23:23.642728 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerStarted","Data":"6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782"} Mar 22 00:23:23 crc kubenswrapper[5116]: I0322 00:23:23.674678 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.674658504 podStartE2EDuration="3.674658504s" podCreationTimestamp="2026-03-22 00:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:23:23.671671907 +0000 UTC m=+874.693973310" watchObservedRunningTime="2026-03-22 00:23:23.674658504 +0000 UTC m=+874.696959887" Mar 22 00:23:31 crc kubenswrapper[5116]: I0322 00:23:31.192045 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:31 crc kubenswrapper[5116]: I0322 00:23:31.193292 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build" containerID="cri-o://6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782" gracePeriod=30 Mar 22 00:23:32 crc kubenswrapper[5116]: I0322 00:23:32.823237 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.268543 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 
00:23:33.268745 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.270950 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-2-sys-config\"" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.271007 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-2-global-ca\"" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.271218 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-2-ca\"" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311113 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311318 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311347 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311469 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311542 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311639 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311672 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311710 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311744 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311814 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311876 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311961 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: 
I0322 00:23:33.415763 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415819 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415844 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415892 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415912 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415931 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415955 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415973 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415995 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416017 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod 
\"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416044 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416304 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416343 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416386 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416597 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: 
\"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416680 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416682 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416611 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416823 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416888 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.417865 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.421518 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.421543 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.434105 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.587017 5116 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.727387 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_5e320997-981e-4e89-992a-8ad6f1f6099a/docker-build/0.log" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.729228 5116 generic.go:358] "Generic (PLEG): container finished" podID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerID="6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782" exitCode=1 Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.729783 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerDied","Data":"6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782"} Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.031231 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.253101 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_5e320997-981e-4e89-992a-8ad6f1f6099a/docker-build/0.log" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.253548 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.332419 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.332885 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.332968 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333008 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333053 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333070 5116 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333105 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333140 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333724 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333793 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333825 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333936 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333974 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334467 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334483 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333969 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334584 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334856 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.335188 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.335431 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.335862 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.339033 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.339066 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.339316 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884" (OuterVolumeSpecName: "kube-api-access-vm884") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "kube-api-access-vm884". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436066 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436109 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436124 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436136 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436151 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436179 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436194 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") on node 
\"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436205 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436216 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.478669 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.537065 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.749922 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerStarted","Data":"d347e1d1d0bfa441475d985bf9ba23296fad7e939d0b5cd81d6c7a37f1fc4343"} Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.749994 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerStarted","Data":"ee2acea936ad827175b72459e0f3f311a25df509cce538719e55f747534cdcb7"} Mar 22 00:23:34 crc 
kubenswrapper[5116]: I0322 00:23:34.755115 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_5e320997-981e-4e89-992a-8ad6f1f6099a/docker-build/0.log" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.755948 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerDied","Data":"217c472cfa6d0d26fe99bb6287f96ab85531e90f295a41a345b695b66937057a"} Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.755959 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.756046 5116 scope.go:117] "RemoveContainer" containerID="6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.839338 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.848053 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.853459 5116 scope.go:117] "RemoveContainer" containerID="4fc1945949f4a48bc881b27cfbe1bdcda0b53a37ec4eb1a6d836aed83950ab03" Mar 22 00:23:35 crc kubenswrapper[5116]: I0322 00:23:35.704314 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" path="/var/lib/kubelet/pods/5e320997-981e-4e89-992a-8ad6f1f6099a/volumes" Mar 22 00:23:35 crc kubenswrapper[5116]: I0322 00:23:35.763035 5116 generic.go:358] "Generic (PLEG): container finished" podID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerID="d347e1d1d0bfa441475d985bf9ba23296fad7e939d0b5cd81d6c7a37f1fc4343" exitCode=0 Mar 22 00:23:35 crc kubenswrapper[5116]: I0322 
00:23:35.763133 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"d347e1d1d0bfa441475d985bf9ba23296fad7e939d0b5cd81d6c7a37f1fc4343"} Mar 22 00:23:36 crc kubenswrapper[5116]: I0322 00:23:36.774219 5116 generic.go:358] "Generic (PLEG): container finished" podID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerID="2f35ca229ed42938e875630eca11bd2baec770d2b9a943980d5198c7d2dd43fb" exitCode=0 Mar 22 00:23:36 crc kubenswrapper[5116]: I0322 00:23:36.774340 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"2f35ca229ed42938e875630eca11bd2baec770d2b9a943980d5198c7d2dd43fb"} Mar 22 00:23:36 crc kubenswrapper[5116]: I0322 00:23:36.808498 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_8a3c0606-0188-415f-9d3f-b6477cda110e/manage-dockerfile/0.log" Mar 22 00:23:37 crc kubenswrapper[5116]: I0322 00:23:37.785694 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerStarted","Data":"3a936d4064aa62dbd3538b2212d49894af70791a2665eec39925b6255ecb5c6b"} Mar 22 00:23:37 crc kubenswrapper[5116]: I0322 00:23:37.819006 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.818983162 podStartE2EDuration="5.818983162s" podCreationTimestamp="2026-03-22 00:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:23:37.817090879 +0000 UTC m=+888.839392252" watchObservedRunningTime="2026-03-22 00:23:37.818983162 +0000 UTC m=+888.841284565" Mar 
22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.710198 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmsdh"] Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711658 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build" Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711677 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build" Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711695 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="manage-dockerfile" Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711704 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="manage-dockerfile" Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711861 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.560000 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"] Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.560240 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.692332 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.692415 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.692517 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.793953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.794141 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod 
\"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.794772 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.794923 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.795398 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.817460 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.892552 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.129148 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"] Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.835157 5116 generic.go:358] "Generic (PLEG): container finished" podID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23" exitCode=0 Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.835374 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23"} Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.835400 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerStarted","Data":"1e631c427ba65a808a4fa4d0e57e810ac41d1ca777f6d8d7f029e3a9a37fd05a"} Mar 22 00:23:47 crc kubenswrapper[5116]: I0322 00:23:47.853259 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerStarted","Data":"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"} Mar 22 00:23:48 crc kubenswrapper[5116]: I0322 00:23:48.861628 5116 generic.go:358] "Generic (PLEG): container finished" podID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a" exitCode=0 Mar 22 00:23:48 crc kubenswrapper[5116]: I0322 00:23:48.861740 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" 
event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"} Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.868514 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerStarted","Data":"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"} Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.891480 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmsdh" podStartSLOduration=6.197640949 podStartE2EDuration="6.891464091s" podCreationTimestamp="2026-03-22 00:23:43 +0000 UTC" firstStartedPulling="2026-03-22 00:23:46.844407941 +0000 UTC m=+897.866709314" lastFinishedPulling="2026-03-22 00:23:47.538231083 +0000 UTC m=+898.560532456" observedRunningTime="2026-03-22 00:23:49.88745187 +0000 UTC m=+900.909753243" watchObservedRunningTime="2026-03-22 00:23:49.891464091 +0000 UTC m=+900.913765464" Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.980621 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.982948 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.988806 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.989785 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.893283 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.893783 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.945189 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.978612 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:55 crc kubenswrapper[5116]: I0322 00:23:55.177017 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"] Mar 22 00:23:56 crc kubenswrapper[5116]: I0322 00:23:56.924906 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmsdh" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server" containerID="cri-o://ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6" gracePeriod=2 Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.324005 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.357949 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod \"7644c42a-78ae-476f-84f8-2f1d9372f921\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.358060 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"7644c42a-78ae-476f-84f8-2f1d9372f921\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.358243 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"7644c42a-78ae-476f-84f8-2f1d9372f921\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.359156 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities" (OuterVolumeSpecName: "utilities") pod "7644c42a-78ae-476f-84f8-2f1d9372f921" (UID: "7644c42a-78ae-476f-84f8-2f1d9372f921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.364670 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8" (OuterVolumeSpecName: "kube-api-access-7lbm8") pod "7644c42a-78ae-476f-84f8-2f1d9372f921" (UID: "7644c42a-78ae-476f-84f8-2f1d9372f921"). InnerVolumeSpecName "kube-api-access-7lbm8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.413567 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7644c42a-78ae-476f-84f8-2f1d9372f921" (UID: "7644c42a-78ae-476f-84f8-2f1d9372f921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.460107 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.460147 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.460159 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.786278 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787675 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787697 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787716 5116 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-utilities" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787723 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-utilities" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787736 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-content" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787745 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-content" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787941 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.804694 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.804839 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.865696 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.865804 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.865868 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951405 5116 generic.go:358] "Generic (PLEG): container finished" podID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6" exitCode=0 Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951516 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951776 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"} Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951865 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"1e631c427ba65a808a4fa4d0e57e810ac41d1ca777f6d8d7f029e3a9a37fd05a"} Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951889 5116 scope.go:117] "RemoveContainer" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.967578 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.967691 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.967713 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"certified-operators-42pwh\" (UID: 
\"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.968262 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.971705 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.978199 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"] Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.980252 5116 scope.go:117] "RemoveContainer" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.984042 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"] Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.990084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.998019 5116 scope.go:117] "RemoveContainer" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23" Mar 22 00:23:58 crc 
kubenswrapper[5116]: I0322 00:23:58.023986 5116 scope.go:117] "RemoveContainer" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6" Mar 22 00:23:58 crc kubenswrapper[5116]: E0322 00:23:58.024563 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6\": container with ID starting with ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6 not found: ID does not exist" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6" Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.024605 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"} err="failed to get container status \"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6\": rpc error: code = NotFound desc = could not find container \"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6\": container with ID starting with ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6 not found: ID does not exist" Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.024631 5116 scope.go:117] "RemoveContainer" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a" Mar 22 00:23:58 crc kubenswrapper[5116]: E0322 00:23:58.024957 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a\": container with ID starting with 0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a not found: ID does not exist" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a" Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.024988 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"} err="failed to get container status \"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a\": rpc error: code = NotFound desc = could not find container \"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a\": container with ID starting with 0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a not found: ID does not exist" Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.025007 5116 scope.go:117] "RemoveContainer" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23" Mar 22 00:23:58 crc kubenswrapper[5116]: E0322 00:23:58.025267 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23\": container with ID starting with a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23 not found: ID does not exist" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23" Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.025285 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23"} err="failed to get container status \"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23\": rpc error: code = NotFound desc = could not find container \"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23\": container with ID starting with a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23 not found: ID does not exist" Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.145746 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.583576 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:23:58 crc kubenswrapper[5116]: W0322 00:23:58.597300 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49647030_2576_4187_ba7e_d7514161f53d.slice/crio-25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a WatchSource:0}: Error finding container 25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a: Status 404 returned error can't find the container with id 25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.599103 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.960875 5116 generic.go:358] "Generic (PLEG): container finished" podID="49647030-2576-4187-ba7e-d7514161f53d" containerID="54526cfea93342666afe65f2e74e59e37e1224b18ad48bbbbfef34ea25a85af6" exitCode=0 Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.960980 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"54526cfea93342666afe65f2e74e59e37e1224b18ad48bbbbfef34ea25a85af6"} Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.961393 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerStarted","Data":"25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a"} Mar 22 00:23:59 crc kubenswrapper[5116]: I0322 00:23:59.709899 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" path="/var/lib/kubelet/pods/7644c42a-78ae-476f-84f8-2f1d9372f921/volumes" Mar 22 00:23:59 crc kubenswrapper[5116]: I0322 00:23:59.972669 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerStarted","Data":"2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429"} Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.135159 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"] Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.141325 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.143227 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.143606 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.144008 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.152225 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"] Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.201608 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"auto-csr-approver-29568984-7vpl2\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " pod="openshift-infra/auto-csr-approver-29568984-7vpl2" 
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.303134 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"auto-csr-approver-29568984-7vpl2\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.321614 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"auto-csr-approver-29568984-7vpl2\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.505841 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.698774 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"] Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.981082 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerStarted","Data":"2efa21c0a91293e007e8e38af7c5a1638c6460f90d946780cdea26aa0fa32a0d"} Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.982806 5116 generic.go:358] "Generic (PLEG): container finished" podID="49647030-2576-4187-ba7e-d7514161f53d" containerID="2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429" exitCode=0 Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.982974 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" 
event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429"} Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.007718 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerStarted","Data":"f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c"} Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.011316 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerStarted","Data":"c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4"} Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.038645 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" podStartSLOduration=1.091766324 podStartE2EDuration="2.038628385s" podCreationTimestamp="2026-03-22 00:24:00 +0000 UTC" firstStartedPulling="2026-03-22 00:24:00.706244446 +0000 UTC m=+911.728545819" lastFinishedPulling="2026-03-22 00:24:01.653106497 +0000 UTC m=+912.675407880" observedRunningTime="2026-03-22 00:24:02.037994064 +0000 UTC m=+913.060295437" watchObservedRunningTime="2026-03-22 00:24:02.038628385 +0000 UTC m=+913.060929748" Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.043072 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-42pwh" podStartSLOduration=4.214725348 podStartE2EDuration="5.043056461s" podCreationTimestamp="2026-03-22 00:23:57 +0000 UTC" firstStartedPulling="2026-03-22 00:23:58.96272189 +0000 UTC m=+909.985023303" lastFinishedPulling="2026-03-22 00:23:59.791053033 +0000 UTC m=+910.813354416" observedRunningTime="2026-03-22 00:24:02.025256253 +0000 UTC m=+913.047557636" 
watchObservedRunningTime="2026-03-22 00:24:02.043056461 +0000 UTC m=+913.065357834" Mar 22 00:24:03 crc kubenswrapper[5116]: I0322 00:24:03.019972 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerDied","Data":"c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4"} Mar 22 00:24:03 crc kubenswrapper[5116]: I0322 00:24:03.019874 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerID="c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4" exitCode=0 Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.316038 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.461356 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.470543 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx" (OuterVolumeSpecName: "kube-api-access-mt5rx") pod "dd880bf8-6058-4924-8268-c4cdcd44bdcf" (UID: "dd880bf8-6058-4924-8268-c4cdcd44bdcf"). InnerVolumeSpecName "kube-api-access-mt5rx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.566460 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.044206 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerDied","Data":"2efa21c0a91293e007e8e38af7c5a1638c6460f90d946780cdea26aa0fa32a0d"} Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.044263 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.044276 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2efa21c0a91293e007e8e38af7c5a1638c6460f90d946780cdea26aa0fa32a0d" Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.390195 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"] Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.396334 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"] Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.708808 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" path="/var/lib/kubelet/pods/07ada1f6-f713-45cb-8230-b9a2d89878ab/volumes" Mar 22 00:24:08 crc kubenswrapper[5116]: I0322 00:24:08.146092 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:08 crc kubenswrapper[5116]: I0322 00:24:08.147050 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:08 crc kubenswrapper[5116]: I0322 00:24:08.197682 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:09 crc kubenswrapper[5116]: I0322 00:24:09.114823 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:09 crc kubenswrapper[5116]: I0322 00:24:09.166369 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:24:11 crc kubenswrapper[5116]: I0322 00:24:11.085811 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-42pwh" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" containerID="cri-o://f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c" gracePeriod=2 Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.096231 5116 generic.go:358] "Generic (PLEG): container finished" podID="49647030-2576-4187-ba7e-d7514161f53d" containerID="f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c" exitCode=0 Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.096441 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c"} Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.552129 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.596161 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"49647030-2576-4187-ba7e-d7514161f53d\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.596258 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"49647030-2576-4187-ba7e-d7514161f53d\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.596308 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"49647030-2576-4187-ba7e-d7514161f53d\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.598192 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities" (OuterVolumeSpecName: "utilities") pod "49647030-2576-4187-ba7e-d7514161f53d" (UID: "49647030-2576-4187-ba7e-d7514161f53d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.606796 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh" (OuterVolumeSpecName: "kube-api-access-j4qqh") pod "49647030-2576-4187-ba7e-d7514161f53d" (UID: "49647030-2576-4187-ba7e-d7514161f53d"). InnerVolumeSpecName "kube-api-access-j4qqh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.637548 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49647030-2576-4187-ba7e-d7514161f53d" (UID: "49647030-2576-4187-ba7e-d7514161f53d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.697196 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.697261 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.697284 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.107786 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a"} Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.108591 5116 scope.go:117] "RemoveContainer" containerID="f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.107900 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.130737 5116 scope.go:117] "RemoveContainer" containerID="2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.147984 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.149990 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.174258 5116 scope.go:117] "RemoveContainer" containerID="54526cfea93342666afe65f2e74e59e37e1224b18ad48bbbbfef34ea25a85af6" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.712146 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49647030-2576-4187-ba7e-d7514161f53d" path="/var/lib/kubelet/pods/49647030-2576-4187-ba7e-d7514161f53d/volumes" Mar 22 00:24:23 crc kubenswrapper[5116]: I0322 00:24:23.057503 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:24:23 crc kubenswrapper[5116]: I0322 00:24:23.057941 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:24:41 crc kubenswrapper[5116]: I0322 00:24:41.293127 5116 generic.go:358] "Generic (PLEG): container finished" podID="8a3c0606-0188-415f-9d3f-b6477cda110e" 
containerID="3a936d4064aa62dbd3538b2212d49894af70791a2665eec39925b6255ecb5c6b" exitCode=0 Mar 22 00:24:41 crc kubenswrapper[5116]: I0322 00:24:41.293183 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"3a936d4064aa62dbd3538b2212d49894af70791a2665eec39925b6255ecb5c6b"} Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.569519 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638297 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638363 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638396 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638417 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638434 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638450 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638534 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638577 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638597 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: 
\"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638617 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638675 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.639379 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.639436 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.640416 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.640490 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.640530 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.642147 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.644609 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.647304 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.647322 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.647357 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb" (OuterVolumeSpecName: "kube-api-access-n2lsb") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "kube-api-access-n2lsb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.739760 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.739977 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740040 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740099 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740156 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740249 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740312 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740370 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740428 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740484 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.815423 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.841628 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:43 crc kubenswrapper[5116]: I0322 00:24:43.312999 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"ee2acea936ad827175b72459e0f3f311a25df509cce538719e55f747534cdcb7"} Mar 22 00:24:43 crc kubenswrapper[5116]: I0322 00:24:43.313273 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2acea936ad827175b72459e0f3f311a25df509cce538719e55f747534cdcb7" Mar 22 00:24:43 crc kubenswrapper[5116]: I0322 00:24:43.313275 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:24:44 crc kubenswrapper[5116]: I0322 00:24:44.592830 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:44 crc kubenswrapper[5116]: I0322 00:24:44.668254 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.110474 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111640 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-utilities" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111905 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-utilities" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111924 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="manage-dockerfile" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111932 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="manage-dockerfile" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111959 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="docker-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111967 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="docker-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111980 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 
00:24:47.111988 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112009 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="git-clone" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112016 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="git-clone" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112024 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerName="oc" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112030 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerName="oc" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112041 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-content" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112048 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-content" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112186 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerName="oc" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112205 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112215 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="docker-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.118832 
5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124422 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-1-global-ca\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124434 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-1-ca\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124427 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-1-sys-config\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124976 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.127817 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201793 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201849 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201872 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201907 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202036 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202083 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202134 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202180 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202212 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202255 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202278 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202335 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304190 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304249 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304305 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304356 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304583 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304643 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304876 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305013 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305066 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305081 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"sg-core-1-build\" (UID: 
\"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305099 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305143 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305250 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305553 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305631 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc 
kubenswrapper[5116]: I0322 00:24:47.305723 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305812 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.306109 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.306149 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.306224 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.313527 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" 
(UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.314693 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.324149 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.434354 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.632806 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:47 crc kubenswrapper[5116]: W0322 00:24:47.639387 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485bf248_0704_4af5_b4b4_f349855d45a7.slice/crio-2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e WatchSource:0}: Error finding container 2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e: Status 404 returned error can't find the container with id 2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.826665 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.832782 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.836519 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.016150 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.016303 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.016367 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.117869 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.117932 5116 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.117972 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.118427 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.118501 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.137315 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.156786 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.362533 5116 generic.go:358] "Generic (PLEG): container finished" podID="485bf248-0704-4af5-b4b4-f349855d45a7" containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" exitCode=0 Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.362798 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerDied","Data":"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf"} Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.362845 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerStarted","Data":"2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e"} Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.383245 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.373343 5116 generic.go:358] "Generic (PLEG): container finished" podID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907" exitCode=0 Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.373409 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907"} Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.374070 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" 
event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerStarted","Data":"26cf5ed539e64f0184332ca616fb1f61e216f05f16c7ffbf5dede9ab4123f429"} Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.376516 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerStarted","Data":"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447"} Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.427634 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=2.427618304 podStartE2EDuration="2.427618304s" podCreationTimestamp="2026-03-22 00:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:24:49.426426086 +0000 UTC m=+960.448727459" watchObservedRunningTime="2026-03-22 00:24:49.427618304 +0000 UTC m=+960.449919677" Mar 22 00:24:50 crc kubenswrapper[5116]: I0322 00:24:50.556270 5116 scope.go:117] "RemoveContainer" containerID="208d35041a700bcc47fefb636464fe18464c55b6addf5e55a5a1888e5fa3efb2" Mar 22 00:24:51 crc kubenswrapper[5116]: I0322 00:24:51.393243 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerStarted","Data":"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"} Mar 22 00:24:52 crc kubenswrapper[5116]: I0322 00:24:52.418246 5116 generic.go:358] "Generic (PLEG): container finished" podID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906" exitCode=0 Mar 22 00:24:52 crc kubenswrapper[5116]: I0322 00:24:52.418316 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" 
event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"} Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.056850 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.057253 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.426507 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerStarted","Data":"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"} Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.449328 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-97d7p" podStartSLOduration=5.423232879 podStartE2EDuration="6.449306872s" podCreationTimestamp="2026-03-22 00:24:47 +0000 UTC" firstStartedPulling="2026-03-22 00:24:49.376500578 +0000 UTC m=+960.398801991" lastFinishedPulling="2026-03-22 00:24:50.402574571 +0000 UTC m=+961.424875984" observedRunningTime="2026-03-22 00:24:53.443220451 +0000 UTC m=+964.465521824" watchObservedRunningTime="2026-03-22 00:24:53.449306872 +0000 UTC m=+964.471608245" Mar 22 00:24:57 crc kubenswrapper[5116]: I0322 00:24:57.678142 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:57 crc kubenswrapper[5116]: I0322 00:24:57.679117 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" containerID="cri-o://9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" gracePeriod=30 Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.101453 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_485bf248-0704-4af5-b4b4-f349855d45a7/docker-build/0.log" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.102451 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.157197 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.157267 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158432 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158522 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158577 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158637 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158675 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158695 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158742 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158795 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") 
pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158833 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158887 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158938 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158968 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159090 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159486 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159731 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159774 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159845 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159859 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.160406 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.164812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt" (OuterVolumeSpecName: "kube-api-access-t2nqt") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "kube-api-access-t2nqt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.166354 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.167253 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.196328 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.258390 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260211 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260245 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260257 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260271 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260283 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260292 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260300 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") on node \"crc\" DevicePath 
\"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260308 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260316 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260328 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260339 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.390074 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.461861 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.463319 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_485bf248-0704-4af5-b4b4-f349855d45a7/docker-build/0.log" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.463876 5116 generic.go:358] "Generic (PLEG): container finished" podID="485bf248-0704-4af5-b4b4-f349855d45a7" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" exitCode=1 Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.464871 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.466320 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerDied","Data":"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447"} Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.466486 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerDied","Data":"2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e"} Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.466563 5116 scope.go:117] "RemoveContainer" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.505466 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 
00:24:58.513539 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.516797 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.522662 5116 scope.go:117] "RemoveContainer" containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.558167 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.611315 5116 scope.go:117] "RemoveContainer" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" Mar 22 00:24:58 crc kubenswrapper[5116]: E0322 00:24:58.611842 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447\": container with ID starting with 9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447 not found: ID does not exist" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.611895 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447"} err="failed to get container status \"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447\": rpc error: code = NotFound desc = could not find container \"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447\": container with ID starting with 9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447 not found: ID does not exist" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.611931 5116 scope.go:117] "RemoveContainer" 
containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" Mar 22 00:24:58 crc kubenswrapper[5116]: E0322 00:24:58.612226 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf\": container with ID starting with 97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf not found: ID does not exist" containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.612254 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf"} err="failed to get container status \"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf\": rpc error: code = NotFound desc = could not find container \"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf\": container with ID starting with 97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf not found: ID does not exist" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.292230 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293378 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293403 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293432 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="manage-dockerfile" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293444 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="manage-dockerfile" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293644 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.349319 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.349460 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.352377 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.352590 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-2-sys-config\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.353292 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-2-global-ca\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.353570 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-2-ca\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476696 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476755 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476810 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476832 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476867 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476894 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476982 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477074 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477113 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477211 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477414 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477454 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578398 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578465 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578503 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578528 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578592 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578883 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578893 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578993 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579082 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579263 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " 
pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579335 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579442 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579480 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579750 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579913 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580048 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580145 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580357 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580454 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580750 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.581125 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.586811 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.591659 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.596546 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.664588 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.708854 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" path="/var/lib/kubelet/pods/485bf248-0704-4af5-b4b4-f349855d45a7/volumes"
Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.893316 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 22 00:24:59 crc kubenswrapper[5116]: W0322 00:24:59.895854 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94167a49_edda_444e_bb01_c8b24c818557.slice/crio-53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459 WatchSource:0}: Error finding container 53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459: Status 404 returned error can't find the container with id 53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.482987 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerStarted","Data":"635157ab685b7378b40d6892c4266da1c8cb5d2f4d61edadd8e0cd2000e9691d"}
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.483418 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerStarted","Data":"53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459"}
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.483372 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-97d7p" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server" containerID="cri-o://84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c" gracePeriod=2
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.854806 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p"
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.900280 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") "
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.900451 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") "
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.900474 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") "
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.901394 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities" (OuterVolumeSpecName: "utilities") pod "3fadb3c9-a4dd-44da-8b81-a8a3be611d61" (UID: "3fadb3c9-a4dd-44da-8b81-a8a3be611d61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.909362 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs" (OuterVolumeSpecName: "kube-api-access-6zlzs") pod "3fadb3c9-a4dd-44da-8b81-a8a3be611d61" (UID: "3fadb3c9-a4dd-44da-8b81-a8a3be611d61"). InnerVolumeSpecName "kube-api-access-6zlzs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.002013 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") on node \"crc\" DevicePath \"\""
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.002048 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.028907 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fadb3c9-a4dd-44da-8b81-a8a3be611d61" (UID: "3fadb3c9-a4dd-44da-8b81-a8a3be611d61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.102998 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491312 5116 generic.go:358] "Generic (PLEG): container finished" podID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c" exitCode=0
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491370 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"}
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491434 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"26cf5ed539e64f0184332ca616fb1f61e216f05f16c7ffbf5dede9ab4123f429"}
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491457 5116 scope.go:117] "RemoveContainer" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.492357 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.493700 5116 generic.go:358] "Generic (PLEG): container finished" podID="94167a49-edda-444e-bb01-c8b24c818557" containerID="635157ab685b7378b40d6892c4266da1c8cb5d2f4d61edadd8e0cd2000e9691d" exitCode=0
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.493883 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"635157ab685b7378b40d6892c4266da1c8cb5d2f4d61edadd8e0cd2000e9691d"}
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.517371 5116 scope.go:117] "RemoveContainer" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.548229 5116 scope.go:117] "RemoveContainer" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.553441 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"]
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.558892 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"]
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.603309 5116 scope.go:117] "RemoveContainer" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"
Mar 22 00:25:01 crc kubenswrapper[5116]: E0322 00:25:01.603614 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c\": container with ID starting with 84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c not found: ID does not exist" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.603645 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"} err="failed to get container status \"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c\": rpc error: code = NotFound desc = could not find container \"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c\": container with ID starting with 84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c not found: ID does not exist"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.603662 5116 scope.go:117] "RemoveContainer" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"
Mar 22 00:25:01 crc kubenswrapper[5116]: E0322 00:25:01.604434 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906\": container with ID starting with 6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906 not found: ID does not exist" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.604455 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"} err="failed to get container status \"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906\": rpc error: code = NotFound desc = could not find container \"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906\": container with ID starting with 6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906 not found: ID does not exist"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.604467 5116 scope.go:117] "RemoveContainer" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907"
Mar 22 00:25:01 crc kubenswrapper[5116]: E0322 00:25:01.604691 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907\": container with ID starting with cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907 not found: ID does not exist" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.604712 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907"} err="failed to get container status \"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907\": rpc error: code = NotFound desc = could not find container \"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907\": container with ID starting with cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907 not found: ID does not exist"
Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.704792 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" path="/var/lib/kubelet/pods/3fadb3c9-a4dd-44da-8b81-a8a3be611d61/volumes"
Mar 22 00:25:02 crc kubenswrapper[5116]: I0322 00:25:02.504643 5116 generic.go:358] "Generic (PLEG): container finished" podID="94167a49-edda-444e-bb01-c8b24c818557" containerID="6c25378c5c67a582ced66fed35969fa2d19e2271e0060a1d9027bc652a4efbd5" exitCode=0
Mar 22 00:25:02 crc kubenswrapper[5116]: I0322 00:25:02.504754 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"6c25378c5c67a582ced66fed35969fa2d19e2271e0060a1d9027bc652a4efbd5"}
Mar 22 00:25:02 crc kubenswrapper[5116]: I0322 00:25:02.536060 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_94167a49-edda-444e-bb01-c8b24c818557/manage-dockerfile/0.log"
Mar 22 00:25:03 crc kubenswrapper[5116]: I0322 00:25:03.513905 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerStarted","Data":"32d83cf2cd3027aeeae521f32493abec068b21a8573e03777a412660a6ffef93"}
Mar 22 00:25:03 crc kubenswrapper[5116]: I0322 00:25:03.537495 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=4.537477629 podStartE2EDuration="4.537477629s" podCreationTimestamp="2026-03-22 00:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:25:03.532791865 +0000 UTC m=+974.555093248" watchObservedRunningTime="2026-03-22 00:25:03.537477629 +0000 UTC m=+974.559779002"
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.056989 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.057637 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.057699 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.058454 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.058538 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029" gracePeriod=600
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643080 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029" exitCode=0
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643258 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029"}
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643466 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd"}
Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643485 5116 scope.go:117] "RemoveContainer" containerID="2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.143890 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"]
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146015 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146044 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146095 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-content"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146107 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-content"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146132 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-utilities"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146144 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-utilities"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146371 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.449144 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"]
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.449320 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.455446 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.455872 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.467585 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.627016 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"auto-csr-approver-29568986-l778x\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") " pod="openshift-infra/auto-csr-approver-29568986-l778x"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.728249 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"auto-csr-approver-29568986-l778x\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") " pod="openshift-infra/auto-csr-approver-29568986-l778x"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.759872 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"auto-csr-approver-29568986-l778x\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") " pod="openshift-infra/auto-csr-approver-29568986-l778x"
Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.784019 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x"
Mar 22 00:26:01 crc kubenswrapper[5116]: I0322 00:26:01.023408 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"]
Mar 22 00:26:01 crc kubenswrapper[5116]: I0322 00:26:01.958417 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerStarted","Data":"aa4ca9133814e5f48cc6f2b6eaadca8bbcb7b070e4190ce04658801f3f251cc0"}
Mar 22 00:26:02 crc kubenswrapper[5116]: I0322 00:26:02.965995 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerStarted","Data":"795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef"}
Mar 22 00:26:03 crc kubenswrapper[5116]: I0322 00:26:03.983008 5116 generic.go:358] "Generic (PLEG): container finished" podID="d42dbb69-e840-4b6a-b719-52396f82919e" containerID="795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef" exitCode=0
Mar 22 00:26:03 crc kubenswrapper[5116]: I0322 00:26:03.983158 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerDied","Data":"795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef"}
Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.262302 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x"
Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.390002 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"d42dbb69-e840-4b6a-b719-52396f82919e\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") "
Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.398096 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj" (OuterVolumeSpecName: "kube-api-access-q2smj") pod "d42dbb69-e840-4b6a-b719-52396f82919e" (UID: "d42dbb69-e840-4b6a-b719-52396f82919e"). InnerVolumeSpecName "kube-api-access-q2smj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.491620 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") on node \"crc\" DevicePath \"\""
Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.999593 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x"
Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.999665 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerDied","Data":"aa4ca9133814e5f48cc6f2b6eaadca8bbcb7b070e4190ce04658801f3f251cc0"}
Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.999701 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa4ca9133814e5f48cc6f2b6eaadca8bbcb7b070e4190ce04658801f3f251cc0"
Mar 22 00:26:06 crc kubenswrapper[5116]: I0322 00:26:06.324070 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"]
Mar 22 00:26:06 crc kubenswrapper[5116]: I0322 00:26:06.332436 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"]
Mar 22 00:26:07 crc kubenswrapper[5116]: I0322 00:26:07.708562 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11840734-dc87-4532-b341-aeb889f011c4" path="/var/lib/kubelet/pods/11840734-dc87-4532-b341-aeb889f011c4/volumes"
Mar 22 00:26:50 crc kubenswrapper[5116]: I0322 00:26:50.703634 5116 scope.go:117] "RemoveContainer" containerID="8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8"
Mar 22 00:27:23 crc kubenswrapper[5116]: I0322 00:27:23.057046 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:27:23 crc kubenswrapper[5116]: I0322 00:27:23.057679 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:27:53 crc kubenswrapper[5116]: I0322 00:27:53.057214 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:27:53 crc kubenswrapper[5116]: I0322 00:27:53.057766 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.135597 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"]
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.136993 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" containerName="oc"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.137013 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" containerName="oc"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.137157 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" containerName="oc"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.159158 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"]
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.159305 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.162103 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.162108 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.163000 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.241132 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"auto-csr-approver-29568988-gbkp8\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") " pod="openshift-infra/auto-csr-approver-29568988-gbkp8"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.342381 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"auto-csr-approver-29568988-gbkp8\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") " pod="openshift-infra/auto-csr-approver-29568988-gbkp8"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.363417 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"auto-csr-approver-29568988-gbkp8\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") " pod="openshift-infra/auto-csr-approver-29568988-gbkp8"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.476762 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8"
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.684948 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"]
Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.746586 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" event={"ID":"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39","Type":"ContainerStarted","Data":"730f3b13214d7bbd73298973658c8d4e1cd3372e57cc1ed17de997720a904f1a"}
Mar 22 00:28:06 crc kubenswrapper[5116]: I0322 00:28:06.788696 5116 generic.go:358] "Generic (PLEG): container finished" podID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerID="6aa0210430a18d2b96a0bdd3c189fd69455e4afb5753bc588ac349572da2555d" exitCode=0
Mar 22 00:28:06 crc kubenswrapper[5116]: I0322 00:28:06.788806 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" event={"ID":"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39","Type":"ContainerDied","Data":"6aa0210430a18d2b96a0bdd3c189fd69455e4afb5753bc588ac349572da2555d"}
Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.006872 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8"
Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.154126 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") "
Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.164855 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8" (OuterVolumeSpecName: "kube-api-access-2h9v8") pod "135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" (UID: "135f0dfe-78e2-4264-ae7b-7d6b95ebbb39"). InnerVolumeSpecName "kube-api-access-2h9v8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.256305 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.802617 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8"
Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.802647 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" event={"ID":"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39","Type":"ContainerDied","Data":"730f3b13214d7bbd73298973658c8d4e1cd3372e57cc1ed17de997720a904f1a"}
Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.802688 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730f3b13214d7bbd73298973658c8d4e1cd3372e57cc1ed17de997720a904f1a"
Mar 22 00:28:09 crc kubenswrapper[5116]: I0322 00:28:09.072484 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"]
Mar 22 00:28:09 crc kubenswrapper[5116]: I0322 00:28:09.076256 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"]
Mar 22 00:28:09 crc kubenswrapper[5116]: I0322 00:28:09.712271 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" path="/var/lib/kubelet/pods/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5/volumes"
Mar 22 00:28:15 crc kubenswrapper[5116]: I0322 00:28:15.863700 5116 generic.go:358] "Generic (PLEG): container finished" podID="94167a49-edda-444e-bb01-c8b24c818557" containerID="32d83cf2cd3027aeeae521f32493abec068b21a8573e03777a412660a6ffef93" exitCode=0
Mar 22 00:28:15 crc kubenswrapper[5116]: I0322 00:28:15.863798 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"32d83cf2cd3027aeeae521f32493abec068b21a8573e03777a412660a6ffef93"}
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.221815 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.289897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.289953 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.289980 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290005 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290026 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290050 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290077 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290188 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290215 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290255 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290285 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") "
Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290293 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "buildcachedir".
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290322 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290430 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291012 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291207 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291241 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291365 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291425 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.335058 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.373894 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.381502 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393005 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393045 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393058 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393070 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.419664 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: 
"94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.432410 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.436590 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49" (OuterVolumeSpecName: "kube-api-access-lwt49") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "kube-api-access-lwt49". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.494375 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.494421 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.494439 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.661750 5116 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.696841 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.884496 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459"} Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.884544 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.884628 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 22 00:28:20 crc kubenswrapper[5116]: I0322 00:28:20.415671 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:20 crc kubenswrapper[5116]: I0322 00:28:20.436442 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.035080 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036830 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerName="oc" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036864 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerName="oc" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036883 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="docker-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036890 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="docker-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036910 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="git-clone" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036929 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="git-clone" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036938 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="manage-dockerfile" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036944 5116 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="manage-dockerfile" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.037064 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerName="oc" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.037082 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="docker-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.097749 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.097920 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100603 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-1-sys-config\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100624 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100629 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-1-global-ca\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100761 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-1-ca\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270236 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " 
pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270296 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270325 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270512 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270580 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270643 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: 
\"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270672 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270715 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270746 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270834 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270904 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270930 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372543 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372610 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372642 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372657 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372682 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372728 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372759 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372774 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373060 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373199 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373424 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373438 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod 
\"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373558 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373628 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373657 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373662 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 
00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373890 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.374477 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.382800 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.383615 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.389861 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.409993 5116 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.610019 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.930146 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerStarted","Data":"771bfa800f24952fe981a0692f46e9fb7737913672821b07cfca47622fd48f98"} Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.057522 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.057599 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.057656 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.058302 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:28:23 crc kubenswrapper[5116]: 
I0322 00:28:23.058368 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd" gracePeriod=600 Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944391 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd" exitCode=0 Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944504 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd"} Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944836 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3"} Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944873 5116 scope.go:117] "RemoveContainer" containerID="04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029" Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.948442 5116 generic.go:358] "Generic (PLEG): container finished" podID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerID="d50c7a1c36052a74413838c90756db463a10e07fe6248f6acd9c877e8aa8ecfb" exitCode=0 Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.948649 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" 
event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerDied","Data":"d50c7a1c36052a74413838c90756db463a10e07fe6248f6acd9c877e8aa8ecfb"} Mar 22 00:28:24 crc kubenswrapper[5116]: I0322 00:28:24.959078 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerStarted","Data":"096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413"} Mar 22 00:28:24 crc kubenswrapper[5116]: I0322 00:28:24.977385 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.977367562 podStartE2EDuration="2.977367562s" podCreationTimestamp="2026-03-22 00:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:28:24.976492995 +0000 UTC m=+1175.998794368" watchObservedRunningTime="2026-03-22 00:28:24.977367562 +0000 UTC m=+1175.999668935" Mar 22 00:28:33 crc kubenswrapper[5116]: I0322 00:28:32.506281 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:33 crc kubenswrapper[5116]: I0322 00:28:32.507157 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build" containerID="cri-o://096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413" gracePeriod=30 Mar 22 00:28:34 crc kubenswrapper[5116]: I0322 00:28:34.876773 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.263859 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.264135 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.267083 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-2-global-ca\"" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.267083 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-2-sys-config\"" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.267361 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-2-ca\"" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376244 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376368 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376411 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376606 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376692 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376814 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376868 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376951 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376996 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.377044 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.377076 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.377213 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.479303 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.479722 
5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.479746 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480054 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480134 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480239 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480396 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480502 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480557 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480484 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480667 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480858 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.481021 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.481073 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.481765 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482004 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482417 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") 
pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482436 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482437 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482854 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.483086 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.500158 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " 
pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.500863 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.514076 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.583630 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.046484 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_d7582c75-581a-41a0-9a56-c0eda9df5932/docker-build/0.log" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.047380 5116 generic.go:358] "Generic (PLEG): container finished" podID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerID="096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413" exitCode=1 Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.047582 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerDied","Data":"096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413"} Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.135713 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 22 00:28:36 crc kubenswrapper[5116]: W0322 
00:28:36.141530 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef337a4f_2072_4bd4_b8bf_c6f17e8de5c0.slice/crio-295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972 WatchSource:0}: Error finding container 295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972: Status 404 returned error can't find the container with id 295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972 Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.772632 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_d7582c75-581a-41a0-9a56-c0eda9df5932/docker-build/0.log" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.773369 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903517 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903561 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903579 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " 
Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903625 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903678 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903696 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903698 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903724 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903782 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903960 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904060 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904110 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904573 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904761 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904815 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.905498 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.905752 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.906219 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.906439 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.907470 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.907578 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.911569 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm" (OuterVolumeSpecName: "kube-api-access-ntttm") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "kube-api-access-ntttm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.912782 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.917630 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.977364 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006210 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006473 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006635 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006810 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007024 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007232 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007468 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007612 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007743 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007872 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.058191 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerStarted","Data":"05b58eee940a9312ea4c72bf7cd6002eb63f68736ff8b05594077378816eeb27"}
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.058246 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerStarted","Data":"295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972"}
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061247 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_d7582c75-581a-41a0-9a56-c0eda9df5932/docker-build/0.log"
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061589 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerDied","Data":"771bfa800f24952fe981a0692f46e9fb7737913672821b07cfca47622fd48f98"}
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061634 5116 scope.go:117] "RemoveContainer" containerID="096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413"
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061782 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build"
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.099277 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.106988 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.112201 5116 scope.go:117] "RemoveContainer" containerID="d50c7a1c36052a74413838c90756db463a10e07fe6248f6acd9c877e8aa8ecfb"
Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.704275 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" path="/var/lib/kubelet/pods/d7582c75-581a-41a0-9a56-c0eda9df5932/volumes"
Mar 22 00:28:38 crc kubenswrapper[5116]: I0322 00:28:38.072484 5116 generic.go:358] "Generic (PLEG): container finished" podID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerID="05b58eee940a9312ea4c72bf7cd6002eb63f68736ff8b05594077378816eeb27" exitCode=0
Mar 22 00:28:38 crc kubenswrapper[5116]: I0322 00:28:38.072553 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"05b58eee940a9312ea4c72bf7cd6002eb63f68736ff8b05594077378816eeb27"}
Mar 22 00:28:39 crc kubenswrapper[5116]: I0322 00:28:39.080723 5116 generic.go:358] "Generic (PLEG): container finished" podID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerID="095801c54210d9a2713b5cbe7ba8078ce5eb8730ede892ce28c5c1b4dcc05f21" exitCode=0
Mar 22 00:28:39 crc kubenswrapper[5116]: I0322 00:28:39.080767 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"095801c54210d9a2713b5cbe7ba8078ce5eb8730ede892ce28c5c1b4dcc05f21"}
Mar 22 00:28:39 crc kubenswrapper[5116]: I0322 00:28:39.121323 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0/manage-dockerfile/0.log"
Mar 22 00:28:40 crc kubenswrapper[5116]: I0322 00:28:40.089061 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerStarted","Data":"4fdc43f3d25442590d2fddb2c31c0e77f73d2b4f4d2d5df7d52f1ad537e21d5d"}
Mar 22 00:28:40 crc kubenswrapper[5116]: I0322 00:28:40.110648 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=6.110631231 podStartE2EDuration="6.110631231s" podCreationTimestamp="2026-03-22 00:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:28:40.110112985 +0000 UTC m=+1191.132414358" watchObservedRunningTime="2026-03-22 00:28:40.110631231 +0000 UTC m=+1191.132932604"
Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.056684 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.075962 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.082565 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.085006 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.836809 5116 scope.go:117] "RemoveContainer" containerID="2ddc5261a5a71c38bc0367d296e6678e1f3648ba5a05ce953a8407e9e3ce8a74"
Mar 22 00:29:28 crc kubenswrapper[5116]: I0322 00:29:28.422775 5116 generic.go:358] "Generic (PLEG): container finished" podID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerID="4fdc43f3d25442590d2fddb2c31c0e77f73d2b4f4d2d5df7d52f1ad537e21d5d" exitCode=0
Mar 22 00:29:28 crc kubenswrapper[5116]: I0322 00:29:28.422942 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"4fdc43f3d25442590d2fddb2c31c0e77f73d2b4f4d2d5df7d52f1ad537e21d5d"}
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.702289 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854199 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854263 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854283 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854315 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854338 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854837 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854908 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854923 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854944 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854978 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855002 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855016 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855068 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") "
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855349 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855335 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855526 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.856352 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.857315 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.857953 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.859291 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.859452 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.859548 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7" (OuterVolumeSpecName: "kube-api-access-r6cg7") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "kube-api-access-r6cg7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956239 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956269 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956278 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956287 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956295 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956303 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956311 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956319 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956326 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.988175 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.058414 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.451684 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972"}
Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.451729 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972"
Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.451822 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.652006 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.667756 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 22 00:29:30 crc kubenswrapper[5116]: E0322 00:29:30.782408 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef337a4f_2072_4bd4_b8bf_c6f17e8de5c0.slice\": RecentStats: unable to find data in memory cache]"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.007984 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009149 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="manage-dockerfile"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009188 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="manage-dockerfile"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009215 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="manage-dockerfile"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009223 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="manage-dockerfile"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009236 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="docker-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009244 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="docker-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009281 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009289 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009297 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="git-clone"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009306 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="git-clone"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009442 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009458 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="docker-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.039048 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.039223 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.041960 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\""
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.042099 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-1-global-ca\""
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.042717 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-1-sys-config\""
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.044312 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-1-ca\""
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121453 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121521 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121557 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121582 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121602 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121624 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121643 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121688 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121851 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121868 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121893 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121911 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223297 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223347 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223461 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223506 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223531 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223585 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223638 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223679 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223733 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224054 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224137 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224808 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224877 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225051 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225077 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225313 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID:
\"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225585 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225950 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.233737 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.240034 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.249818 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmgq\" (UniqueName: 
\"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.358103 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.793228 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.798569 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:29:35 crc kubenswrapper[5116]: I0322 00:29:35.489222 5116 generic.go:358] "Generic (PLEG): container finished" podID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" exitCode=0 Mar 22 00:29:35 crc kubenswrapper[5116]: I0322 00:29:35.489465 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerDied","Data":"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa"} Mar 22 00:29:35 crc kubenswrapper[5116]: I0322 00:29:35.490419 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerStarted","Data":"42db315b2f8931ff27b8e8122f743c37cd2891b092b07d7e09f6d9b04be78a9b"} Mar 22 00:29:36 crc kubenswrapper[5116]: I0322 00:29:36.499700 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerStarted","Data":"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d"} 
Mar 22 00:29:36 crc kubenswrapper[5116]: I0322 00:29:36.522883 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.522869772 podStartE2EDuration="3.522869772s" podCreationTimestamp="2026-03-22 00:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:29:36.522611284 +0000 UTC m=+1247.544912667" watchObservedRunningTime="2026-03-22 00:29:36.522869772 +0000 UTC m=+1247.545171145" Mar 22 00:29:44 crc kubenswrapper[5116]: I0322 00:29:44.893096 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:44 crc kubenswrapper[5116]: I0322 00:29:44.893671 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" containerID="cri-o://cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" gracePeriod=30 Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.736515 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_dc017a91-5056-42fd-8d59-57baacaf0c14/docker-build/0.log" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.737333 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752863 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752918 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752949 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752973 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752997 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753032 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753058 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753085 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753106 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753189 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753214 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753240 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.755142 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.755779 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.756541 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.757477 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.757535 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.757564 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.762082 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.762846 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.764049 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq" (OuterVolumeSpecName: "kube-api-access-crmgq") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "kube-api-access-crmgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.764066 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.813494 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846079 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_dc017a91-5056-42fd-8d59-57baacaf0c14/docker-build/0.log" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846569 5116 generic.go:358] "Generic (PLEG): container finished" podID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" exitCode=1 Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846622 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerDied","Data":"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d"} Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846686 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerDied","Data":"42db315b2f8931ff27b8e8122f743c37cd2891b092b07d7e09f6d9b04be78a9b"} Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846707 5116 scope.go:117] "RemoveContainer" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846646 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855574 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855607 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855619 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855631 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855643 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855653 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855667 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") 
on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855678 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855689 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855701 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855711 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.880726 5116 scope.go:117] "RemoveContainer" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.940161 5116 scope.go:117] "RemoveContainer" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" Mar 22 00:29:45 crc kubenswrapper[5116]: E0322 00:29:45.940609 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d\": container with ID starting with cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d not found: ID does not exist" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" Mar 22 00:29:45 crc 
kubenswrapper[5116]: I0322 00:29:45.940650 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d"} err="failed to get container status \"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d\": rpc error: code = NotFound desc = could not find container \"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d\": container with ID starting with cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d not found: ID does not exist" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.940679 5116 scope.go:117] "RemoveContainer" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" Mar 22 00:29:45 crc kubenswrapper[5116]: E0322 00:29:45.941021 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa\": container with ID starting with 1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa not found: ID does not exist" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.941044 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa"} err="failed to get container status \"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa\": rpc error: code = NotFound desc = could not find container \"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa\": container with ID starting with 1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa not found: ID does not exist" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.080115 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.159284 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.189337 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.195739 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.578294 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579399 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579428 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579514 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="manage-dockerfile" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579525 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="manage-dockerfile" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 
00:29:46.579746 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.626987 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.627070 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.628925 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-2-global-ca\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.628976 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-2-ca\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.629239 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-2-sys-config\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.630196 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665085 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665482 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665517 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665573 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665605 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665634 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc 
kubenswrapper[5116]: I0322 00:29:46.665667 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665708 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665736 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665799 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665825 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: 
\"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665860 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767588 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767631 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767670 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767691 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767723 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767748 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767756 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 
00:29:46.767770 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767885 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767921 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767960 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768001 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768036 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768309 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768538 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768626 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768701 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.769331 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.769850 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.770249 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.775346 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.779466 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: 
\"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.793150 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.950106 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.140418 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.709630 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" path="/var/lib/kubelet/pods/dc017a91-5056-42fd-8d59-57baacaf0c14/volumes" Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.866656 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerStarted","Data":"9f1455b4e1d683b8147a4e7cbe39bfd2b64e3db289d053d240e16d814779b540"} Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.866743 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerStarted","Data":"a8d14e8cd61c30324bd428b5c66778dfcef4610119afccd6d0a42381bcfec935"} Mar 22 00:29:48 crc kubenswrapper[5116]: I0322 00:29:48.872857 5116 
generic.go:358] "Generic (PLEG): container finished" podID="80323414-c785-4e29-ac99-d15e78a522e6" containerID="9f1455b4e1d683b8147a4e7cbe39bfd2b64e3db289d053d240e16d814779b540" exitCode=0 Mar 22 00:29:48 crc kubenswrapper[5116]: I0322 00:29:48.872915 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"9f1455b4e1d683b8147a4e7cbe39bfd2b64e3db289d053d240e16d814779b540"} Mar 22 00:29:49 crc kubenswrapper[5116]: I0322 00:29:49.883193 5116 generic.go:358] "Generic (PLEG): container finished" podID="80323414-c785-4e29-ac99-d15e78a522e6" containerID="cc649ae09f5f63aa5638af626fb6ba6a9bce42b912741ed45d606cb5dadab8f7" exitCode=0 Mar 22 00:29:49 crc kubenswrapper[5116]: I0322 00:29:49.883308 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"cc649ae09f5f63aa5638af626fb6ba6a9bce42b912741ed45d606cb5dadab8f7"} Mar 22 00:29:49 crc kubenswrapper[5116]: I0322 00:29:49.950796 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_80323414-c785-4e29-ac99-d15e78a522e6/manage-dockerfile/0.log" Mar 22 00:29:50 crc kubenswrapper[5116]: I0322 00:29:50.892543 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerStarted","Data":"cc2dd2bd84c435afd43ea278dfbf8c6acf00d8d304d7a3b1991c1034d5caed81"} Mar 22 00:29:50 crc kubenswrapper[5116]: I0322 00:29:50.915357 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=4.915336728 podStartE2EDuration="4.915336728s" podCreationTimestamp="2026-03-22 00:29:46 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:29:50.914719279 +0000 UTC m=+1261.937020662" watchObservedRunningTime="2026-03-22 00:29:50.915336728 +0000 UTC m=+1261.937638101" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.129735 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.136735 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.136913 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.139702 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.141471 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.143515 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.144051 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.144389 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.146772 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.146961 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.154333 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272196 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272299 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7km4\" (UniqueName: 
\"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272503 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"auto-csr-approver-29568990-vx25q\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") " pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272663 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.373474 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.373569 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"auto-csr-approver-29568990-vx25q\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") " pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc 
kubenswrapper[5116]: I0322 00:30:00.373611 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.373654 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.375218 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.387364 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.394016 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"auto-csr-approver-29568990-vx25q\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") " 
pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.403699 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.458864 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.474963 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.681261 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5"] Mar 22 00:30:00 crc kubenswrapper[5116]: W0322 00:30:00.708512 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c0d10d_b2a7_4649_a456_658425b37334.slice/crio-883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a WatchSource:0}: Error finding container 883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a: Status 404 returned error can't find the container with id 883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.931069 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.972773 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568990-vx25q" 
event={"ID":"753649c7-f19a-4b90-a29a-2108a691e934","Type":"ContainerStarted","Data":"3a1396751c56c487fc0af757cd9cc0665099170218de35e0c80420c8d61d1ed1"} Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.974047 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerStarted","Data":"9fc9693d6cf3e606702f2ea7cd0f1963e72f90585a2a3981d32b3514955061d7"} Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.974135 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerStarted","Data":"883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a"} Mar 22 00:30:01 crc kubenswrapper[5116]: I0322 00:30:01.000401 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" podStartSLOduration=1.000376021 podStartE2EDuration="1.000376021s" podCreationTimestamp="2026-03-22 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:30:00.99620662 +0000 UTC m=+1272.018508023" watchObservedRunningTime="2026-03-22 00:30:01.000376021 +0000 UTC m=+1272.022677424" Mar 22 00:30:01 crc kubenswrapper[5116]: I0322 00:30:01.985234 5116 generic.go:358] "Generic (PLEG): container finished" podID="18c0d10d-b2a7-4649-a456-658425b37334" containerID="9fc9693d6cf3e606702f2ea7cd0f1963e72f90585a2a3981d32b3514955061d7" exitCode=0 Mar 22 00:30:01 crc kubenswrapper[5116]: I0322 00:30:01.985398 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" 
event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerDied","Data":"9fc9693d6cf3e606702f2ea7cd0f1963e72f90585a2a3981d32b3514955061d7"} Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.290656 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.414590 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"18c0d10d-b2a7-4649-a456-658425b37334\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.414687 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"18c0d10d-b2a7-4649-a456-658425b37334\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.415363 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"18c0d10d-b2a7-4649-a456-658425b37334\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.416788 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume" (OuterVolumeSpecName: "config-volume") pod "18c0d10d-b2a7-4649-a456-658425b37334" (UID: "18c0d10d-b2a7-4649-a456-658425b37334"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.422263 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4" (OuterVolumeSpecName: "kube-api-access-h7km4") pod "18c0d10d-b2a7-4649-a456-658425b37334" (UID: "18c0d10d-b2a7-4649-a456-658425b37334"). InnerVolumeSpecName "kube-api-access-h7km4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.426076 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "18c0d10d-b2a7-4649-a456-658425b37334" (UID: "18c0d10d-b2a7-4649-a456-658425b37334"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.517818 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.517857 5116 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.517869 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.001428 5116 generic.go:358] "Generic (PLEG): container finished" podID="753649c7-f19a-4b90-a29a-2108a691e934" 
containerID="85bd7d245d43142b219c3be5ac468eafb8ba3e6e6a34155393343c620d8b140b" exitCode=0
Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.001540 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568990-vx25q" event={"ID":"753649c7-f19a-4b90-a29a-2108a691e934","Type":"ContainerDied","Data":"85bd7d245d43142b219c3be5ac468eafb8ba3e6e6a34155393343c620d8b140b"}
Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.003866 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerDied","Data":"883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a"}
Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.003944 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a"
Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.004078 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5"
Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.275299 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q"
Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.338260 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"753649c7-f19a-4b90-a29a-2108a691e934\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") "
Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.348251 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v" (OuterVolumeSpecName: "kube-api-access-6ww6v") pod "753649c7-f19a-4b90-a29a-2108a691e934" (UID: "753649c7-f19a-4b90-a29a-2108a691e934"). InnerVolumeSpecName "kube-api-access-6ww6v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.439406 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.017367 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568990-vx25q" event={"ID":"753649c7-f19a-4b90-a29a-2108a691e934","Type":"ContainerDied","Data":"3a1396751c56c487fc0af757cd9cc0665099170218de35e0c80420c8d61d1ed1"}
Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.018076 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a1396751c56c487fc0af757cd9cc0665099170218de35e0c80420c8d61d1ed1"
Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.017626 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q"
Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.338313 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"]
Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.343261 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"]
Mar 22 00:30:07 crc kubenswrapper[5116]: I0322 00:30:07.706650 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" path="/var/lib/kubelet/pods/dd880bf8-6058-4924-8268-c4cdcd44bdcf/volumes"
Mar 22 00:30:23 crc kubenswrapper[5116]: I0322 00:30:23.057244 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:30:23 crc kubenswrapper[5116]: I0322 00:30:23.057860 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:30:48 crc kubenswrapper[5116]: I0322 00:30:48.532596 5116 generic.go:358] "Generic (PLEG): container finished" podID="80323414-c785-4e29-ac99-d15e78a522e6" containerID="cc2dd2bd84c435afd43ea278dfbf8c6acf00d8d304d7a3b1991c1034d5caed81" exitCode=0
Mar 22 00:30:48 crc kubenswrapper[5116]: I0322 00:30:48.532646 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"cc2dd2bd84c435afd43ea278dfbf8c6acf00d8d304d7a3b1991c1034d5caed81"}
Mar 22 00:30:49 crc kubenswrapper[5116]: I0322 00:30:49.842880 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.017898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.017976 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018012 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018034 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018097 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018143 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018242 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018326 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018355 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018379 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018406 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018473 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") "
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018534 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018597 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019013 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019037 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019278 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019681 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019707 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019721 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019735 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019747 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.020004 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.020610 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.025099 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh" (OuterVolumeSpecName: "kube-api-access-65nzh") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "kube-api-access-65nzh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.025334 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.025572 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120581 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120620 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120629 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120637 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120648 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.154382 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.222357 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.552180 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.552193 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"a8d14e8cd61c30324bd428b5c66778dfcef4610119afccd6d0a42381bcfec935"}
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.552227 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8d14e8cd61c30324bd428b5c66778dfcef4610119afccd6d0a42381bcfec935"
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.953492 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.992016 5116 scope.go:117] "RemoveContainer" containerID="c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4"
Mar 22 00:30:51 crc kubenswrapper[5116]: I0322 00:30:51.034446 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 22 00:30:53 crc kubenswrapper[5116]: I0322 00:30:53.057772 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:30:53 crc kubenswrapper[5116]: I0322 00:30:53.057854 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.073386 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"]
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074392 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="git-clone"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074404 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="git-clone"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074422 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="manage-dockerfile"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074429 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="manage-dockerfile"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074469 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="753649c7-f19a-4b90-a29a-2108a691e934" containerName="oc"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074475 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="753649c7-f19a-4b90-a29a-2108a691e934" containerName="oc"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074498 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18c0d10d-b2a7-4649-a456-658425b37334" containerName="collect-profiles"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074507 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c0d10d-b2a7-4649-a456-658425b37334" containerName="collect-profiles"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074517 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="docker-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074523 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="docker-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074629 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="753649c7-f19a-4b90-a29a-2108a691e934" containerName="oc"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074650 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="docker-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074659 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="18c0d10d-b2a7-4649-a456-658425b37334" containerName="collect-profiles"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.203620 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"]
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.203766 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.205443 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-1-global-ca\""
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.205476 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\""
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.206081 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-1-sys-config\""
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.206912 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-1-ca\""
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240838 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240878 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240910 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240934 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241026 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241188 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241264 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241379 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241412 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241483 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241565 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241610 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342635 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342691 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342710 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342752 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342805 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342930 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343073 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343106 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343120 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343150 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343245 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343295 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343343 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343380 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343670 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343766 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343800 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343966 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.344038 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.344128 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.344560 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.348238 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.348618 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.361963 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.517945 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.717306 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 22 00:30:59 crc kubenswrapper[5116]: W0322 00:30:59.723246 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258a2fcf_8ff5_4c12_bf18_c6a2e8d3b125.slice/crio-ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf WatchSource:0}: Error finding container ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf: Status 404 returned error can't find the container with id ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf Mar 22 00:31:00 crc kubenswrapper[5116]: I0322 00:31:00.639986 5116 generic.go:358] "Generic (PLEG): container finished" podID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerID="4106dafe916c442ff6a6c84ef7c764001d5f6257e8b17fde82563cb3dcaa24f7" exitCode=0 Mar 22 00:31:00 crc kubenswrapper[5116]: I0322 00:31:00.640067 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerDied","Data":"4106dafe916c442ff6a6c84ef7c764001d5f6257e8b17fde82563cb3dcaa24f7"} Mar 22 00:31:00 crc kubenswrapper[5116]: I0322 00:31:00.640449 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerStarted","Data":"ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf"} Mar 22 00:31:01 crc kubenswrapper[5116]: I0322 00:31:01.651230 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/docker-build/0.log" Mar 22 
00:31:01 crc kubenswrapper[5116]: I0322 00:31:01.651828 5116 generic.go:358] "Generic (PLEG): container finished" podID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerID="9efa61524ace63b51ff35c7e96502efaaf7a080de1173648c97977985912b667" exitCode=1 Mar 22 00:31:01 crc kubenswrapper[5116]: I0322 00:31:01.651870 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerDied","Data":"9efa61524ace63b51ff35c7e96502efaaf7a080de1173648c97977985912b667"} Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.931706 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/docker-build/0.log" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.932153 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994702 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994790 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994829 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994871 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994937 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994981 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995086 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc 
kubenswrapper[5116]: I0322 00:31:02.995119 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995195 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995244 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995296 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996038 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996075 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996219 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996442 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996483 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996562 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996920 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996969 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.997481 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.001087 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7" (OuterVolumeSpecName: "kube-api-access-v96f7") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "kube-api-access-v96f7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.001222 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.002301 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096476 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096516 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096525 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096535 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096548 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096557 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096565 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") on node \"crc\" 
DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096573 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096581 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096591 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096600 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096612 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.669460 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/docker-build/0.log" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.670226 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.670246 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerDied","Data":"ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf"} Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.670285 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf" Mar 22 00:31:09 crc kubenswrapper[5116]: I0322 00:31:09.820858 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 22 00:31:09 crc kubenswrapper[5116]: I0322 00:31:09.831152 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.396798 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.398640 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="manage-dockerfile" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.398798 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="manage-dockerfile" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.398939 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="docker-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.399048 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" 
containerName="docker-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.399444 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="docker-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.409620 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.412883 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-2-global-ca\"" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.412893 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-2-ca\"" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.413102 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-2-sys-config\"" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.413224 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.414049 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515701 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515767 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515850 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515900 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515970 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516010 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516062 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516156 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516223 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516251 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") 
" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516299 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516326 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618143 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618257 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618320 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" 
(UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618371 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618404 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618478 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618519 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618559 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618606 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618649 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618688 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618779 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618947 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.619385 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.619582 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.619921 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 
00:31:11.619943 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.620288 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.620451 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.622057 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.622260 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.632030 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.632556 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.651603 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.706107 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" path="/var/lib/kubelet/pods/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/volumes" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.733884 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:12 crc kubenswrapper[5116]: I0322 00:31:12.001500 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 22 00:31:12 crc kubenswrapper[5116]: I0322 00:31:12.732354 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerStarted","Data":"6dfe2256b6ccd919ea4b14f777ef296f7562bf0cb1382ff9a867530fbfd779ff"} Mar 22 00:31:12 crc kubenswrapper[5116]: I0322 00:31:12.733380 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerStarted","Data":"66430cee706b285a6fea6a00a1645b2321d2adfb9b419b2649762ad243ebcf40"} Mar 22 00:31:13 crc kubenswrapper[5116]: I0322 00:31:13.740711 5116 generic.go:358] "Generic (PLEG): container finished" podID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerID="6dfe2256b6ccd919ea4b14f777ef296f7562bf0cb1382ff9a867530fbfd779ff" exitCode=0 Mar 22 00:31:13 crc kubenswrapper[5116]: I0322 00:31:13.740787 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"6dfe2256b6ccd919ea4b14f777ef296f7562bf0cb1382ff9a867530fbfd779ff"} Mar 22 00:31:14 crc kubenswrapper[5116]: I0322 00:31:14.749190 5116 generic.go:358] "Generic (PLEG): container finished" podID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerID="a093665aa1e085f39ea150d88b2f17d5e4d29cdbc6776804e097d4c6ab856552" exitCode=0 Mar 22 00:31:14 crc kubenswrapper[5116]: I0322 00:31:14.749249 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"a093665aa1e085f39ea150d88b2f17d5e4d29cdbc6776804e097d4c6ab856552"} Mar 22 00:31:14 crc kubenswrapper[5116]: I0322 00:31:14.784370 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_e4b4f564-3e34-47db-a558-376d32b6d7e3/manage-dockerfile/0.log" Mar 22 00:31:15 crc kubenswrapper[5116]: I0322 00:31:15.759925 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerStarted","Data":"c7e0bb0f8d6531da174daaada44701a4c5b4e858735065eb40a8d1dec71c974c"} Mar 22 00:31:15 crc kubenswrapper[5116]: I0322 00:31:15.795808 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.79578666 podStartE2EDuration="4.79578666s" podCreationTimestamp="2026-03-22 00:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:31:15.789241404 +0000 UTC m=+1346.811542787" watchObservedRunningTime="2026-03-22 00:31:15.79578666 +0000 UTC m=+1346.818088033" Mar 22 00:31:19 crc kubenswrapper[5116]: I0322 00:31:19.791026 5116 generic.go:358] "Generic (PLEG): container finished" podID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerID="c7e0bb0f8d6531da174daaada44701a4c5b4e858735065eb40a8d1dec71c974c" exitCode=0 Mar 22 00:31:19 crc kubenswrapper[5116]: I0322 00:31:19.791069 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"c7e0bb0f8d6531da174daaada44701a4c5b4e858735065eb40a8d1dec71c974c"} Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.082246 5116 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.155635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.155769 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.155857 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.156400 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.156627 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 
00:31:21.156952 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.156978 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157056 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157097 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157117 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157192 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: 
\"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157216 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158276 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158294 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158578 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158702 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.159244 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.159404 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.159478 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.165631 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.166875 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.169928 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.170371 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.172054 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km" (OuterVolumeSpecName: "kube-api-access-8h8km") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "kube-api-access-8h8km". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259417 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259457 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259472 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259483 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259495 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259507 5116 
reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259518 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259529 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259541 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259553 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259565 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259580 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.811208 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"66430cee706b285a6fea6a00a1645b2321d2adfb9b419b2649762ad243ebcf40"} Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.811502 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66430cee706b285a6fea6a00a1645b2321d2adfb9b419b2649762ad243ebcf40" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.811302 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.056841 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.056949 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.057015 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.057898 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.058016 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3" gracePeriod=600 Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828475 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3" exitCode=0 Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828543 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3"} Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828939 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"} Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828966 5116 scope.go:117] "RemoveContainer" containerID="b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.757624 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758360 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="git-clone" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758374 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="git-clone" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758396 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="manage-dockerfile" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758404 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="manage-dockerfile" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758424 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="docker-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758431 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="docker-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758586 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="docker-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.947951 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.948320 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.950945 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-1-ca\"" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.951037 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-1-sys-config\"" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.951378 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-1-global-ca\"" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.953621 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027049 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027116 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027314 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027513 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027592 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027749 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027821 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027987 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028017 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028095 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028155 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028201 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129750 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129791 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129997 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130101 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130185 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130251 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130282 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130334 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130378 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130396 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130455 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130537 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130590 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130615 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130733 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130745 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130845 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc 
kubenswrapper[5116]: I0322 00:31:25.131093 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.131099 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.131209 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.138484 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.139231 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: 
\"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.147970 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.268708 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.706934 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:25 crc kubenswrapper[5116]: W0322 00:31:25.710900 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d404dae_83c3_4875_8b37_3240f8a35259.slice/crio-57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7 WatchSource:0}: Error finding container 57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7: Status 404 returned error can't find the container with id 57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7 Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.853216 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerStarted","Data":"57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7"} Mar 22 00:31:26 crc kubenswrapper[5116]: I0322 00:31:26.861962 5116 generic.go:358] "Generic (PLEG): container finished" podID="5d404dae-83c3-4875-8b37-3240f8a35259" 
containerID="095ee6b2ca6fc427f5a465f41b63b9951a947f316d689c28f653b52df20fd554" exitCode=0 Mar 22 00:31:26 crc kubenswrapper[5116]: I0322 00:31:26.862097 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerDied","Data":"095ee6b2ca6fc427f5a465f41b63b9951a947f316d689c28f653b52df20fd554"} Mar 22 00:31:27 crc kubenswrapper[5116]: I0322 00:31:27.874532 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_5d404dae-83c3-4875-8b37-3240f8a35259/docker-build/0.log" Mar 22 00:31:27 crc kubenswrapper[5116]: I0322 00:31:27.875514 5116 generic.go:358] "Generic (PLEG): container finished" podID="5d404dae-83c3-4875-8b37-3240f8a35259" containerID="2a6933fce9cc74d356d1d5a231d547fdb2259fd5ca76515682cf2f100750a7ff" exitCode=1 Mar 22 00:31:27 crc kubenswrapper[5116]: I0322 00:31:27.875629 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerDied","Data":"2a6933fce9cc74d356d1d5a231d547fdb2259fd5ca76515682cf2f100750a7ff"} Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.141705 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_5d404dae-83c3-4875-8b37-3240f8a35259/docker-build/0.log" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.142484 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287294 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287374 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287407 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287488 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287618 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287642 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287664 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287714 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287805 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287836 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287903 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287944 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288259 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288581 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288604 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288643 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288717 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.289113 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.289646 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.290590 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.292504 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.296030 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq" (OuterVolumeSpecName: "kube-api-access-zqnrq") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "kube-api-access-zqnrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.297033 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.302195 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389070 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389120 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389136 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389146 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389158 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389190 5116 
reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389201 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389212 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389223 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389233 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389245 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389255 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890089 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_5d404dae-83c3-4875-8b37-3240f8a35259/docker-build/0.log" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890795 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890803 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerDied","Data":"57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7"} Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890868 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7" Mar 22 00:31:35 crc kubenswrapper[5116]: I0322 00:31:35.283813 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:35 crc kubenswrapper[5116]: I0322 00:31:35.293403 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:35 crc kubenswrapper[5116]: I0322 00:31:35.707697 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" path="/var/lib/kubelet/pods/5d404dae-83c3-4875-8b37-3240f8a35259/volumes" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.887442 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888411 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="manage-dockerfile" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888429 5116 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="manage-dockerfile" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888444 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="docker-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888452 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="docker-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888617 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="docker-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.903264 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907106 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-2-ca\"" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907153 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-2-global-ca\"" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907112 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907955 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-2-sys-config\"" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.919727 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.989922 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.989998 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990038 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990094 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990115 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990196 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990282 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990399 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990425 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990545 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990596 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990654 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092417 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092461 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092481 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092665 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093701 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093734 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093522 5116 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093869 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093919 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093947 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093965 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 
crc kubenswrapper[5116]: I0322 00:31:37.094011 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093265 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094034 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094043 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094081 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094054 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094241 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094332 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094479 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094771 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.100061 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.100065 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.109490 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.220514 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.461024 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.955669 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerStarted","Data":"8b809c921fad586c32a2fe64225aefd18ac9e19113f917fa44e139134998184b"} Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.956065 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerStarted","Data":"c86a8ab826af7ddc8585f7822b5a9d7c3603da327e396daa6ea858ea8f24b95e"} Mar 22 00:31:38 crc kubenswrapper[5116]: I0322 00:31:38.987991 5116 generic.go:358] "Generic (PLEG): container finished" podID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerID="8b809c921fad586c32a2fe64225aefd18ac9e19113f917fa44e139134998184b" exitCode=0 Mar 22 00:31:38 crc kubenswrapper[5116]: I0322 00:31:38.988044 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"8b809c921fad586c32a2fe64225aefd18ac9e19113f917fa44e139134998184b"} Mar 22 00:31:40 crc kubenswrapper[5116]: I0322 00:31:40.000584 5116 generic.go:358] "Generic (PLEG): container finished" podID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerID="b4960de0e26fafe9b64394aff120e75e95fa9e842b754124cc9dedd6c2af58c4" exitCode=0 Mar 22 00:31:40 crc kubenswrapper[5116]: I0322 00:31:40.000807 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" 
event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"b4960de0e26fafe9b64394aff120e75e95fa9e842b754124cc9dedd6c2af58c4"} Mar 22 00:31:40 crc kubenswrapper[5116]: I0322 00:31:40.044389 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_af329613-40e2-4658-86b9-a1d39a51a9ac/manage-dockerfile/0.log" Mar 22 00:31:41 crc kubenswrapper[5116]: I0322 00:31:41.012453 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerStarted","Data":"065c9d4f1db0c26993215352e7cce4dde4864d6117683d62a0e4adac61c43f63"} Mar 22 00:31:41 crc kubenswrapper[5116]: I0322 00:31:41.038088 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.038071404 podStartE2EDuration="5.038071404s" podCreationTimestamp="2026-03-22 00:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:31:41.03541997 +0000 UTC m=+1372.057721343" watchObservedRunningTime="2026-03-22 00:31:41.038071404 +0000 UTC m=+1372.060372777" Mar 22 00:31:46 crc kubenswrapper[5116]: I0322 00:31:46.051139 5116 generic.go:358] "Generic (PLEG): container finished" podID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerID="065c9d4f1db0c26993215352e7cce4dde4864d6117683d62a0e4adac61c43f63" exitCode=0 Mar 22 00:31:46 crc kubenswrapper[5116]: I0322 00:31:46.051214 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"065c9d4f1db0c26993215352e7cce4dde4864d6117683d62a0e4adac61c43f63"} Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.329912 5116 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441643 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441683 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441714 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441760 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441836 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc 
kubenswrapper[5116]: I0322 00:31:47.441907 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441934 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442019 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442075 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442397 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443012 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443002 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443467 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443569 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442100 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443656 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443684 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443845 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.444769 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.444836 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445462 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445525 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445538 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445565 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445578 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445589 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445602 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445613 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.448687 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.453893 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl" (OuterVolumeSpecName: "kube-api-access-b6fhl") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "kube-api-access-b6fhl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.453897 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). 
InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.456895 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546489 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546521 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546532 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546541 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:48 crc kubenswrapper[5116]: I0322 00:31:48.073953 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" 
event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"c86a8ab826af7ddc8585f7822b5a9d7c3603da327e396daa6ea858ea8f24b95e"} Mar 22 00:31:48 crc kubenswrapper[5116]: I0322 00:31:48.073987 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86a8ab826af7ddc8585f7822b5a9d7c3603da327e396daa6ea858ea8f24b95e" Mar 22 00:31:48 crc kubenswrapper[5116]: I0322 00:31:48.074059 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.132488 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"] Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133701 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="manage-dockerfile" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133715 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="manage-dockerfile" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133735 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="docker-build" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133741 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="docker-build" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133757 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="git-clone" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133763 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="git-clone" Mar 22 00:32:00 crc 
kubenswrapper[5116]: I0322 00:32:00.133866 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="docker-build" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.141088 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.143687 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.144067 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.145422 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"] Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.149865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.220740 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"auto-csr-approver-29568992-2pdr9\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.322010 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"auto-csr-approver-29568992-2pdr9\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:00 crc 
kubenswrapper[5116]: I0322 00:32:00.342028 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"auto-csr-approver-29568992-2pdr9\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.466946 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.678062 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"] Mar 22 00:32:01 crc kubenswrapper[5116]: I0322 00:32:01.179729 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" event={"ID":"6a9e5029-c0b6-4edb-a790-d67f5791bab2","Type":"ContainerStarted","Data":"107d510fc55bcf7ed0d7f1994bf108101a66973dfce548697cc10916b771397e"} Mar 22 00:32:02 crc kubenswrapper[5116]: I0322 00:32:02.187345 5116 generic.go:358] "Generic (PLEG): container finished" podID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerID="b895d59dbbcf506c3c0f496201be78280c2900728c487398d81316e12e931fac" exitCode=0 Mar 22 00:32:02 crc kubenswrapper[5116]: I0322 00:32:02.187436 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" event={"ID":"6a9e5029-c0b6-4edb-a790-d67f5791bab2","Type":"ContainerDied","Data":"b895d59dbbcf506c3c0f496201be78280c2900728c487398d81316e12e931fac"} Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.441850 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.567021 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.576228 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4" (OuterVolumeSpecName: "kube-api-access-fk6s4") pod "6a9e5029-c0b6-4edb-a790-d67f5791bab2" (UID: "6a9e5029-c0b6-4edb-a790-d67f5791bab2"). InnerVolumeSpecName "kube-api-access-fk6s4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.669889 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.210864 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" event={"ID":"6a9e5029-c0b6-4edb-a790-d67f5791bab2","Type":"ContainerDied","Data":"107d510fc55bcf7ed0d7f1994bf108101a66973dfce548697cc10916b771397e"} Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.211267 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="107d510fc55bcf7ed0d7f1994bf108101a66973dfce548697cc10916b771397e" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.211136 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.387621 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.388272 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerName="oc" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.388288 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerName="oc" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.388379 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerName="oc" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.393532 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.395462 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-dockercfg\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.395461 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-1-sys-config\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.395982 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.396223 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-1-ca\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.396891 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-1-global-ca\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.410832 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481221 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481284 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481312 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481400 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481441 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481540 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481590 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481641 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481673 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481697 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481821 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481887 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481948 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" 
(UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.495845 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.502136 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.582993 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583035 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583055 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583086 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583106 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583122 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583167 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583254 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583434 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583470 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583526 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583568 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583636 5116 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583744 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583823 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584205 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584252 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584318 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584324 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584375 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584446 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584571 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.587717 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.587945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.590987 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.602398 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.711887 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.919431 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 22 00:32:04 crc kubenswrapper[5116]: W0322 00:32:04.928263 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc31bf437_0e3b_460f_ba0c_4b172f455201.slice/crio-2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c WatchSource:0}: Error finding container 2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c: Status 404 returned error can't find the container with id 2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c Mar 22 00:32:05 crc kubenswrapper[5116]: I0322 00:32:05.216765 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerStarted","Data":"2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c"} Mar 22 00:32:05 crc kubenswrapper[5116]: I0322 00:32:05.704062 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" path="/var/lib/kubelet/pods/d42dbb69-e840-4b6a-b719-52396f82919e/volumes" Mar 22 00:32:06 crc kubenswrapper[5116]: I0322 00:32:06.229116 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerStarted","Data":"b93bd1618b4a88d7aa23a73088f70123e675abc6fffcdbe560999c59ba33e7db"} Mar 22 00:32:07 crc 
kubenswrapper[5116]: I0322 00:32:07.240802 5116 generic.go:358] "Generic (PLEG): container finished" podID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerID="b93bd1618b4a88d7aa23a73088f70123e675abc6fffcdbe560999c59ba33e7db" exitCode=0 Mar 22 00:32:07 crc kubenswrapper[5116]: I0322 00:32:07.240878 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"b93bd1618b4a88d7aa23a73088f70123e675abc6fffcdbe560999c59ba33e7db"} Mar 22 00:32:08 crc kubenswrapper[5116]: I0322 00:32:08.250102 5116 generic.go:358] "Generic (PLEG): container finished" podID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerID="3230f47f645627d0982892f22f2beeaf875033c794a1e3775aa92d829e7da434" exitCode=0 Mar 22 00:32:08 crc kubenswrapper[5116]: I0322 00:32:08.250240 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"3230f47f645627d0982892f22f2beeaf875033c794a1e3775aa92d829e7da434"} Mar 22 00:32:08 crc kubenswrapper[5116]: I0322 00:32:08.287498 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_c31bf437-0e3b-460f-ba0c-4b172f455201/manage-dockerfile/0.log" Mar 22 00:32:09 crc kubenswrapper[5116]: I0322 00:32:09.260317 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerStarted","Data":"74a06b5b226c72a9a8d2bcbdf47193630da4297aa507d3bbc891de1064f2eb18"} Mar 22 00:32:09 crc kubenswrapper[5116]: I0322 00:32:09.299887 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=5.299867097 
podStartE2EDuration="5.299867097s" podCreationTimestamp="2026-03-22 00:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:32:09.291134762 +0000 UTC m=+1400.313436155" watchObservedRunningTime="2026-03-22 00:32:09.299867097 +0000 UTC m=+1400.322168480" Mar 22 00:32:47 crc kubenswrapper[5116]: I0322 00:32:47.539816 5116 generic.go:358] "Generic (PLEG): container finished" podID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerID="74a06b5b226c72a9a8d2bcbdf47193630da4297aa507d3bbc891de1064f2eb18" exitCode=0 Mar 22 00:32:47 crc kubenswrapper[5116]: I0322 00:32:47.539933 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"74a06b5b226c72a9a8d2bcbdf47193630da4297aa507d3bbc891de1064f2eb18"} Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.815281 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.929770 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.929843 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.929865 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930142 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930277 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930370 
5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930451 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930525 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930633 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930728 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930853 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930967 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.931041 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930853 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930859 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930889 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930884 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930911 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.931715 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.932549 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.936868 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.936944 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm" (OuterVolumeSpecName: "kube-api-access-6j8rm") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "kube-api-access-6j8rm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.937007 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.937991 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032591 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032626 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032640 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032653 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032665 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 
00:32:49.032676 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032686 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032696 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032723 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032737 5116 reconciler_common.go:299] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032752 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.207291 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: 
"c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.236124 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.558000 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c"} Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.558043 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c" Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.558053 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.114128 5116 scope.go:117] "RemoveContainer" containerID="795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.227567 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.268318 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.840303 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"] Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.841921 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="manage-dockerfile" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.842023 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="manage-dockerfile" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.842100 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="docker-build" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.843130 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="docker-build" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.843276 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="git-clone" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.843357 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="git-clone" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.844705 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="docker-build" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.873372 5116 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"] Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.873603 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.876653 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"infrawatch-operators-dockercfg-mwrhj\"" Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.977735 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"infrawatch-operators-pbfzj\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") " pod="service-telemetry/infrawatch-operators-pbfzj" Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.078919 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"infrawatch-operators-pbfzj\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") " pod="service-telemetry/infrawatch-operators-pbfzj" Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.105671 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"infrawatch-operators-pbfzj\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") " pod="service-telemetry/infrawatch-operators-pbfzj" Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.196751 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj" Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.647897 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"] Mar 22 00:32:52 crc kubenswrapper[5116]: W0322 00:32:52.663471 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f55d1d_3c22_43be_84f4_9cef2bb268f0.slice/crio-03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f WatchSource:0}: Error finding container 03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f: Status 404 returned error can't find the container with id 03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f Mar 22 00:32:53 crc kubenswrapper[5116]: I0322 00:32:53.583898 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerStarted","Data":"03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f"} Mar 22 00:32:57 crc kubenswrapper[5116]: I0322 00:32:57.233073 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"] Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.049050 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-g5xjt"] Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.507694 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-g5xjt"] Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.507828 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-g5xjt" Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.569903 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpx8\" (UniqueName: \"kubernetes.io/projected/8ac45e34-89d0-4eb0-b696-51b35b33b23e-kube-api-access-cjpx8\") pod \"infrawatch-operators-g5xjt\" (UID: \"8ac45e34-89d0-4eb0-b696-51b35b33b23e\") " pod="service-telemetry/infrawatch-operators-g5xjt" Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.670700 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpx8\" (UniqueName: \"kubernetes.io/projected/8ac45e34-89d0-4eb0-b696-51b35b33b23e-kube-api-access-cjpx8\") pod \"infrawatch-operators-g5xjt\" (UID: \"8ac45e34-89d0-4eb0-b696-51b35b33b23e\") " pod="service-telemetry/infrawatch-operators-g5xjt" Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.710756 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpx8\" (UniqueName: \"kubernetes.io/projected/8ac45e34-89d0-4eb0-b696-51b35b33b23e-kube-api-access-cjpx8\") pod \"infrawatch-operators-g5xjt\" (UID: \"8ac45e34-89d0-4eb0-b696-51b35b33b23e\") " pod="service-telemetry/infrawatch-operators-g5xjt" Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.826665 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-g5xjt" Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.245929 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-g5xjt"] Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.640874 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerStarted","Data":"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"} Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.640996 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-pbfzj" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server" containerID="cri-o://adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb" gracePeriod=2 Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.645106 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-g5xjt" event={"ID":"8ac45e34-89d0-4eb0-b696-51b35b33b23e","Type":"ContainerStarted","Data":"57983bd1879f74e47a1934e8861f59b2e62c326b1718640ebba28c521ad77e3c"} Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.645203 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-g5xjt" event={"ID":"8ac45e34-89d0-4eb0-b696-51b35b33b23e","Type":"ContainerStarted","Data":"e1ba8e236edf3c272fdce84bd6bc25926a88c4eaa660cb177b22b2d29a8c98b2"} Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.656605 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-pbfzj" podStartSLOduration=2.149997514 podStartE2EDuration="11.656581926s" podCreationTimestamp="2026-03-22 00:32:51 +0000 UTC" firstStartedPulling="2026-03-22 00:32:52.66542245 +0000 UTC m=+1443.687723843" lastFinishedPulling="2026-03-22 
00:33:02.172006872 +0000 UTC m=+1453.194308255" observedRunningTime="2026-03-22 00:33:02.654393627 +0000 UTC m=+1453.676695010" watchObservedRunningTime="2026-03-22 00:33:02.656581926 +0000 UTC m=+1453.678883299" Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.675307 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-g5xjt" podStartSLOduration=4.565823842 podStartE2EDuration="4.675273902s" podCreationTimestamp="2026-03-22 00:32:58 +0000 UTC" firstStartedPulling="2026-03-22 00:33:02.258673287 +0000 UTC m=+1453.280974670" lastFinishedPulling="2026-03-22 00:33:02.368123357 +0000 UTC m=+1453.390424730" observedRunningTime="2026-03-22 00:33:02.670792761 +0000 UTC m=+1453.693094164" watchObservedRunningTime="2026-03-22 00:33:02.675273902 +0000 UTC m=+1453.697575315" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.184542 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.230446 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") " Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.237488 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q" (OuterVolumeSpecName: "kube-api-access-vps4q") pod "79f55d1d-3c22-43be-84f4-9cef2bb268f0" (UID: "79f55d1d-3c22-43be-84f4-9cef2bb268f0"). InnerVolumeSpecName "kube-api-access-vps4q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.333253 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") on node \"crc\" DevicePath \"\"" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663384 5116 generic.go:358] "Generic (PLEG): container finished" podID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb" exitCode=0 Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663595 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerDied","Data":"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"} Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663659 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerDied","Data":"03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f"} Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663690 5116 scope.go:117] "RemoveContainer" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663567 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.695624 5116 scope.go:117] "RemoveContainer" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb" Mar 22 00:33:03 crc kubenswrapper[5116]: E0322 00:33:03.696139 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb\": container with ID starting with adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb not found: ID does not exist" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.696229 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"} err="failed to get container status \"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb\": rpc error: code = NotFound desc = could not find container \"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb\": container with ID starting with adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb not found: ID does not exist" Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.726790 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"] Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.737321 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"] Mar 22 00:33:03 crc kubenswrapper[5116]: E0322 00:33:03.833812 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f55d1d_3c22_43be_84f4_9cef2bb268f0.slice\": RecentStats: unable to find data in memory cache]" Mar 22 
00:33:05 crc kubenswrapper[5116]: I0322 00:33:05.706153 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" path="/var/lib/kubelet/pods/79f55d1d-3c22-43be-84f4-9cef2bb268f0/volumes"
Mar 22 00:33:08 crc kubenswrapper[5116]: I0322 00:33:08.827311 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:08 crc kubenswrapper[5116]: I0322 00:33:08.827703 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:08 crc kubenswrapper[5116]: I0322 00:33:08.862703 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:09 crc kubenswrapper[5116]: I0322 00:33:09.767374 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:23 crc kubenswrapper[5116]: I0322 00:33:23.057494 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:33:23 crc kubenswrapper[5116]: I0322 00:33:23.057979 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.502524 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"]
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.505040 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.505228 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.505616 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.517788 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"]
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.517972 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.543274 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.543383 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.543527 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645249 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645384 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645445 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645751 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645826 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.666074 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.863604 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.088523 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"]
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.297204 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"]
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.318553 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"]
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.318839 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.357512 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.357593 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.357669 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.458594 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.458675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.458729 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.459150 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.459254 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.481360 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.632480 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.860432 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerID="b474626d7ff0ec5a95ee881e6ec583483802dd233c57a9ca2ca5510d36b5d845" exitCode=0
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.860489 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"b474626d7ff0ec5a95ee881e6ec583483802dd233c57a9ca2ca5510d36b5d845"}
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.860845 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerStarted","Data":"a87780d2b4bac827afdf7443e6daa4e225ba7f3f20729ab614d7fe7e816d18bf"}
Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.916935 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"]
Mar 22 00:33:26 crc kubenswrapper[5116]: W0322 00:33:26.921592 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e0ed20_34bb_4aa9_a4c8_89dcbd6c3849.slice/crio-c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326 WatchSource:0}: Error finding container c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326: Status 404 returned error can't find the container with id c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326
Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.872038 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerID="3f4e40260b8dbda06424c155eee70b7f1a5cb6d29b68c5b72bee79ab8f75cd36" exitCode=0
Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.872096 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"3f4e40260b8dbda06424c155eee70b7f1a5cb6d29b68c5b72bee79ab8f75cd36"}
Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.874617 5116 generic.go:358] "Generic (PLEG): container finished" podID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerID="d07efa1469944ab7ce5272e63bd6eb423b16d83caaed4ab51b9ddd7648ab0ace" exitCode=0
Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.874671 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"d07efa1469944ab7ce5272e63bd6eb423b16d83caaed4ab51b9ddd7648ab0ace"}
Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.874695 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerStarted","Data":"c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326"}
Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.886836 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerID="763c994a37e899cabf039fd393317e3c5c41cfac5090cae3eb617ae14452b664" exitCode=0
Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.886964 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"763c994a37e899cabf039fd393317e3c5c41cfac5090cae3eb617ae14452b664"}
Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.890667 5116 generic.go:358] "Generic (PLEG): container finished" podID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerID="21f8b92f22c044dccc23a9859663e13d5fdfdc8158900fe15d4d76809e28bf7a" exitCode=0
Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.890714 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"21f8b92f22c044dccc23a9859663e13d5fdfdc8158900fe15d4d76809e28bf7a"}
Mar 22 00:33:29 crc kubenswrapper[5116]: I0322 00:33:29.902010 5116 generic.go:358] "Generic (PLEG): container finished" podID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerID="9f47b38185c3654b673c989e5c9b0bc8649578fd9e8d08fe1fb50b7b03c3dd7c" exitCode=0
Mar 22 00:33:29 crc kubenswrapper[5116]: I0322 00:33:29.902140 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"9f47b38185c3654b673c989e5c9b0bc8649578fd9e8d08fe1fb50b7b03c3dd7c"}
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.163068 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.215021 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"4fc23f04-b941-4ba1-877b-076f0f27569a\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") "
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.215114 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"4fc23f04-b941-4ba1-877b-076f0f27569a\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") "
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.215224 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"4fc23f04-b941-4ba1-877b-076f0f27569a\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") "
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.216783 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle" (OuterVolumeSpecName: "bundle") pod "4fc23f04-b941-4ba1-877b-076f0f27569a" (UID: "4fc23f04-b941-4ba1-877b-076f0f27569a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.223477 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz" (OuterVolumeSpecName: "kube-api-access-sjspz") pod "4fc23f04-b941-4ba1-877b-076f0f27569a" (UID: "4fc23f04-b941-4ba1-877b-076f0f27569a"). InnerVolumeSpecName "kube-api-access-sjspz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.227779 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util" (OuterVolumeSpecName: "util") pod "4fc23f04-b941-4ba1-877b-076f0f27569a" (UID: "4fc23f04-b941-4ba1-877b-076f0f27569a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.316781 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") on node \"crc\" DevicePath \"\""
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.316827 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") on node \"crc\" DevicePath \"\""
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.316840 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") on node \"crc\" DevicePath \"\""
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.914094 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"a87780d2b4bac827afdf7443e6daa4e225ba7f3f20729ab614d7fe7e816d18bf"}
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.914344 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87780d2b4bac827afdf7443e6daa4e225ba7f3f20729ab614d7fe7e816d18bf"
Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.914119 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.172664 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.231452 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") "
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.231647 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") "
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.231709 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") "
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.232392 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle" (OuterVolumeSpecName: "bundle") pod "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" (UID: "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.236492 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226" (OuterVolumeSpecName: "kube-api-access-dn226") pod "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" (UID: "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849"). InnerVolumeSpecName "kube-api-access-dn226". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.250328 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util" (OuterVolumeSpecName: "util") pod "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" (UID: "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.333580 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") on node \"crc\" DevicePath \"\""
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.333617 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") on node \"crc\" DevicePath \"\""
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.333628 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") on node \"crc\" DevicePath \"\""
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.923877 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326"}
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.924267 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326"
Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.923915 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.410830 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-55b77595c-67kjz"]
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411871 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="util"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411886 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="util"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411900 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="util"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411905 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="util"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411912 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="pull"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411918 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="pull"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411936 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="extract"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411941 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="extract"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411957 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="pull"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411962 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="pull"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411978 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="extract"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411982 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="extract"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.412066 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="extract"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.412074 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="extract"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.431655 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b77595c-67kjz"]
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.431806 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.435240 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-dockercfg-xzx4z\""
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.477320 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8nj\" (UniqueName: \"kubernetes.io/projected/fcfe47c3-284e-4c8b-b35f-06a0499313a5-kube-api-access-hc8nj\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.477389 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/fcfe47c3-284e-4c8b-b35f-06a0499313a5-runner\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.578962 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc8nj\" (UniqueName: \"kubernetes.io/projected/fcfe47c3-284e-4c8b-b35f-06a0499313a5-kube-api-access-hc8nj\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.579017 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/fcfe47c3-284e-4c8b-b35f-06a0499313a5-runner\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.579447 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/fcfe47c3-284e-4c8b-b35f-06a0499313a5-runner\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.596254 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8nj\" (UniqueName: \"kubernetes.io/projected/fcfe47c3-284e-4c8b-b35f-06a0499313a5-kube-api-access-hc8nj\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.774434 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz"
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.006417 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b77595c-67kjz"]
Mar 22 00:33:35 crc kubenswrapper[5116]: W0322 00:33:35.024562 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcfe47c3_284e_4c8b_b35f_06a0499313a5.slice/crio-e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a WatchSource:0}: Error finding container e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a: Status 404 returned error can't find the container with id e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.847528 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-564975b589-lf9hg"]
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.862260 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-564975b589-lf9hg"]
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.862392 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.865486 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-dockercfg-bh5jq\""
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.963556 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" event={"ID":"fcfe47c3-284e-4c8b-b35f-06a0499313a5","Type":"ContainerStarted","Data":"e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a"}
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.996453 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5f8e2cff-6459-4aae-8cf6-a48587311f68-runner\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.996607 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw42l\" (UniqueName: \"kubernetes.io/projected/5f8e2cff-6459-4aae-8cf6-a48587311f68-kube-api-access-cw42l\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.099084 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5f8e2cff-6459-4aae-8cf6-a48587311f68-runner\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.099348 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw42l\" (UniqueName: \"kubernetes.io/projected/5f8e2cff-6459-4aae-8cf6-a48587311f68-kube-api-access-cw42l\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.100031 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5f8e2cff-6459-4aae-8cf6-a48587311f68-runner\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.143124 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw42l\" (UniqueName: \"kubernetes.io/projected/5f8e2cff-6459-4aae-8cf6-a48587311f68-kube-api-access-cw42l\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.188650 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg"
Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.615366 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-564975b589-lf9hg"]
Mar 22 00:33:36 crc kubenswrapper[5116]: W0322 00:33:36.630613 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f8e2cff_6459_4aae_8cf6_a48587311f68.slice/crio-ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24 WatchSource:0}: Error finding container ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24: Status 404 returned error can't find the container with id ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24
Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.993266 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" event={"ID":"5f8e2cff-6459-4aae-8cf6-a48587311f68","Type":"ContainerStarted","Data":"ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24"}
Mar 22 00:33:53 crc kubenswrapper[5116]: I0322 00:33:53.056902 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:33:53 crc kubenswrapper[5116]: I0322 00:33:53.057253 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.113995 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.119841 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.134250 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.137508 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:33:57 crc kubenswrapper[5116]: I0322 00:33:57.161558 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" event={"ID":"fcfe47c3-284e-4c8b-b35f-06a0499313a5","Type":"ContainerStarted","Data":"dad31067a41f6fc3033d6f01d5d1b9e7b4f361746cc24e0263c7d437271c296e"}
Mar 22 00:33:57 crc kubenswrapper[5116]: I0322 00:33:57.163933 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" event={"ID":"5f8e2cff-6459-4aae-8cf6-a48587311f68","Type":"ContainerStarted","Data":"0b6bffef25f9036758329187792f9e0fe15eaad0e6870e9c13b26f13baff06fa"}
Mar 22 00:33:57 crc kubenswrapper[5116]: I0322 00:33:57.184763 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" podStartSLOduration=2.090863636 podStartE2EDuration="23.1847485s" podCreationTimestamp="2026-03-22 00:33:34 +0000 UTC" firstStartedPulling="2026-03-22 00:33:35.027118425 +0000 UTC m=+1486.049419798" lastFinishedPulling="2026-03-22 00:33:56.121003299 +0000 UTC m=+1507.143304662"
observedRunningTime="2026-03-22 00:33:57.183406448 +0000 UTC m=+1508.205707821" watchObservedRunningTime="2026-03-22 00:33:57.1847485 +0000 UTC m=+1508.207049873" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.153849 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" podStartSLOduration=5.646590447 podStartE2EDuration="25.153819694s" podCreationTimestamp="2026-03-22 00:33:35 +0000 UTC" firstStartedPulling="2026-03-22 00:33:36.63257059 +0000 UTC m=+1487.654871973" lastFinishedPulling="2026-03-22 00:33:56.139799857 +0000 UTC m=+1507.162101220" observedRunningTime="2026-03-22 00:33:57.204906922 +0000 UTC m=+1508.227208325" watchObservedRunningTime="2026-03-22 00:34:00.153819694 +0000 UTC m=+1511.176121077" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.154824 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.198953 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.199110 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.201396 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.203240 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.203417 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.264449 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"auto-csr-approver-29568994-tx27l\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.366218 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"auto-csr-approver-29568994-tx27l\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.390801 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"auto-csr-approver-29568994-tx27l\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.516012 5116 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.984033 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:34:00 crc kubenswrapper[5116]: W0322 00:34:00.987416 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7f4a688_3ca1_4538_9b16_323899848ec1.slice/crio-7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a WatchSource:0}: Error finding container 7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a: Status 404 returned error can't find the container with id 7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a Mar 22 00:34:01 crc kubenswrapper[5116]: I0322 00:34:01.196731 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerStarted","Data":"7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a"} Mar 22 00:34:02 crc kubenswrapper[5116]: I0322 00:34:02.207081 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerStarted","Data":"1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4"} Mar 22 00:34:02 crc kubenswrapper[5116]: I0322 00:34:02.227217 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29568994-tx27l" podStartSLOduration=1.30926425 podStartE2EDuration="2.227196772s" podCreationTimestamp="2026-03-22 00:34:00 +0000 UTC" firstStartedPulling="2026-03-22 00:34:00.989738748 +0000 UTC m=+1512.012040121" lastFinishedPulling="2026-03-22 00:34:01.90767127 +0000 UTC m=+1512.929972643" observedRunningTime="2026-03-22 00:34:02.225074066 
+0000 UTC m=+1513.247375449" watchObservedRunningTime="2026-03-22 00:34:02.227196772 +0000 UTC m=+1513.249498155" Mar 22 00:34:03 crc kubenswrapper[5116]: I0322 00:34:03.217111 5116 generic.go:358] "Generic (PLEG): container finished" podID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerID="1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4" exitCode=0 Mar 22 00:34:03 crc kubenswrapper[5116]: I0322 00:34:03.217229 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerDied","Data":"1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4"} Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.568925 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.736486 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"a7f4a688-3ca1-4538-9b16-323899848ec1\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.743050 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw" (OuterVolumeSpecName: "kube-api-access-r5xkw") pod "a7f4a688-3ca1-4538-9b16-323899848ec1" (UID: "a7f4a688-3ca1-4538-9b16-323899848ec1"). InnerVolumeSpecName "kube-api-access-r5xkw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.839568 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") on node \"crc\" DevicePath \"\"" Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.241021 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerDied","Data":"7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a"} Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.241076 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a" Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.241111 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.286402 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"] Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.305686 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"] Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.713015 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" path="/var/lib/kubelet/pods/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39/volumes" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.959208 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.960504 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerName="oc" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.960523 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerName="oc" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.960675 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerName="oc" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.973560 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.978634 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-openstack-credentials\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.978983 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-users\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979024 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-inter-router-ca\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979135 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-dockercfg-cm89f\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979185 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-interconnect-sasl-config\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979392 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-openstack-ca\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.980371 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.982377 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-inter-router-credentials\"" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.096986 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097191 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097331 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097388 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod 
\"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097408 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097463 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097543 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199080 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc 
kubenswrapper[5116]: I0322 00:34:22.199236 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199289 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199444 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199502 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199536 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.203885 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.209050 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.209275 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc 
kubenswrapper[5116]: I0322 00:34:22.210140 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.210267 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.228407 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.234768 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.296204 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.656731 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.057144 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.057289 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.057383 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.058309 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.058415 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" 
containerID="cri-o://d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" gracePeriod=600 Mar 22 00:34:23 crc kubenswrapper[5116]: E0322 00:34:23.235694 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.431674 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerStarted","Data":"bf3991c3ece1f301b0d1fccfff351a4b6d180bae7f391fffa01dbd097f35a02d"} Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.436272 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" exitCode=0 Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.436475 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"} Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.436534 5116 scope.go:117] "RemoveContainer" containerID="49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.437439 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:34:23 crc kubenswrapper[5116]: E0322 00:34:23.437897 5116 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:34:28 crc kubenswrapper[5116]: I0322 00:34:28.475281 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerStarted","Data":"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"} Mar 22 00:34:28 crc kubenswrapper[5116]: I0322 00:34:28.501331 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" podStartSLOduration=2.548602045 podStartE2EDuration="7.501305245s" podCreationTimestamp="2026-03-22 00:34:21 +0000 UTC" firstStartedPulling="2026-03-22 00:34:22.660007231 +0000 UTC m=+1533.682308604" lastFinishedPulling="2026-03-22 00:34:27.612710431 +0000 UTC m=+1538.635011804" observedRunningTime="2026-03-22 00:34:28.493527732 +0000 UTC m=+1539.515829105" watchObservedRunningTime="2026-03-22 00:34:28.501305245 +0000 UTC m=+1539.523606638" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.082719 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.096589 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.102322 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"serving-certs-ca-bundle\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.102522 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"prometheus-default-tls-assets-0\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.102337 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"prometheus-default\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.106402 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"prometheus-stf-dockercfg-9zlsj\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.106739 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-default-rulefiles-2\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107659 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-prometheus-proxy-tls\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107739 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-default-rulefiles-0\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107779 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-default-rulefiles-1\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107814 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-session-secret\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.108028 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"service-telemetry\"/\"prometheus-default-web-config\"" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.118858 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256205 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-tls-assets\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256270 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256421 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256594 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 
00:34:32.256648 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-kube-api-access-68cff\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256743 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256874 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256977 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd2bf44b-01e8-4236-9dda-90998dd75b88-config-out\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257019 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " 
pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257060 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257083 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-web-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257125 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358605 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd2bf44b-01e8-4236-9dda-90998dd75b88-config-out\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358667 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: 
\"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358720 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-web-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358745 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358855 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-tls-assets\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358874 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: 
\"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358902 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358946 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358969 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-kube-api-access-68cff\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.359007 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.359040 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.360265 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.361260 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.361336 5116 secret.go:189] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.361398 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls podName:bd2bf44b-01e8-4236-9dda-90998dd75b88 nodeName:}" failed. No retries permitted until 2026-03-22 00:34:32.861380517 +0000 UTC m=+1543.883681900 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "bd2bf44b-01e8-4236-9dda-90998dd75b88") : secret "default-prometheus-proxy-tls" not found Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.361866 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.362342 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.367373 5116 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.367447 5116 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68142e33650e2a4f12e38c949b077fedb6bc181702745383b76390572000844b/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.367801 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.369576 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd2bf44b-01e8-4236-9dda-90998dd75b88-config-out\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.370475 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.378988 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-tls-assets\") pod \"prometheus-default-0\" (UID: 
\"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.387545 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-web-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.390637 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-kube-api-access-68cff\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.422307 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.866624 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.868809 5116 secret.go:189] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.868895 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls podName:bd2bf44b-01e8-4236-9dda-90998dd75b88 nodeName:}" failed. No retries permitted until 2026-03-22 00:34:33.86887372 +0000 UTC m=+1544.891175133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "bd2bf44b-01e8-4236-9dda-90998dd75b88") : secret "default-prometheus-proxy-tls" not found Mar 22 00:34:33 crc kubenswrapper[5116]: I0322 00:34:33.882152 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:33 crc kubenswrapper[5116]: I0322 00:34:33.888611 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0" Mar 22 00:34:33 crc kubenswrapper[5116]: I0322 00:34:33.927828 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 22 00:34:34 crc kubenswrapper[5116]: I0322 00:34:34.220834 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 22 00:34:34 crc kubenswrapper[5116]: W0322 00:34:34.224549 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2bf44b_01e8_4236_9dda_90998dd75b88.slice/crio-8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e WatchSource:0}: Error finding container 8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e: Status 404 returned error can't find the container with id 8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e Mar 22 00:34:34 crc kubenswrapper[5116]: I0322 00:34:34.525940 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e"} Mar 22 00:34:36 crc kubenswrapper[5116]: I0322 00:34:36.698015 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:34:36 crc kubenswrapper[5116]: E0322 00:34:36.699079 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:34:38 crc kubenswrapper[5116]: I0322 00:34:38.580804 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"927e86438d6ed8377c017c761d46dcc9030f2456729b3c91c60ced84d2b01f15"} Mar 22 00:34:41 crc kubenswrapper[5116]: I0322 00:34:41.860666 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-694dc457d5-77wwx"] Mar 22 00:34:41 crc kubenswrapper[5116]: I0322 00:34:41.871940 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" Mar 22 00:34:41 crc kubenswrapper[5116]: I0322 00:34:41.880266 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-694dc457d5-77wwx"] Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.025635 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dpnx\" (UniqueName: \"kubernetes.io/projected/536dbd5d-337e-43fa-925a-e88d3be7da06-kube-api-access-8dpnx\") pod \"default-snmp-webhook-694dc457d5-77wwx\" (UID: \"536dbd5d-337e-43fa-925a-e88d3be7da06\") " pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.127158 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dpnx\" (UniqueName: \"kubernetes.io/projected/536dbd5d-337e-43fa-925a-e88d3be7da06-kube-api-access-8dpnx\") pod \"default-snmp-webhook-694dc457d5-77wwx\" (UID: \"536dbd5d-337e-43fa-925a-e88d3be7da06\") " pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.155674 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dpnx\" (UniqueName: \"kubernetes.io/projected/536dbd5d-337e-43fa-925a-e88d3be7da06-kube-api-access-8dpnx\") pod \"default-snmp-webhook-694dc457d5-77wwx\" (UID: \"536dbd5d-337e-43fa-925a-e88d3be7da06\") " 
pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.199657 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.463049 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-694dc457d5-77wwx"] Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.477690 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.618059 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" event={"ID":"536dbd5d-337e-43fa-925a-e88d3be7da06","Type":"ContainerStarted","Data":"21df424f5a1103eb9542e3aa79b2d9a068b666580c132c116d2be44b12c91aff"} Mar 22 00:34:44 crc kubenswrapper[5116]: I0322 00:34:44.632991 5116 generic.go:358] "Generic (PLEG): container finished" podID="bd2bf44b-01e8-4236-9dda-90998dd75b88" containerID="927e86438d6ed8377c017c761d46dcc9030f2456729b3c91c60ced84d2b01f15" exitCode=0 Mar 22 00:34:44 crc kubenswrapper[5116]: I0322 00:34:44.633194 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerDied","Data":"927e86438d6ed8377c017c761d46dcc9030f2456729b3c91c60ced84d2b01f15"} Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.341013 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.380115 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.380318 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.383721 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-alertmanager-proxy-tls\"" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.384103 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-cluster-tls-config\"" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.384317 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-stf-dockercfg-b5vhx\"" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.384450 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-tls-assets-0\"" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.385015 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-generated\"" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.389398 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-web-config\"" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-volume\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503658 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-session-secret\") pod 
\"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503694 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bt6\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-kube-api-access-c2bt6\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503726 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503762 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503785 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-out\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503800 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503821 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503836 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-web-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605213 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605297 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-out\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605358 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605386 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605413 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-web-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: E0322 00:34:45.605510 5116 secret.go:189] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:45 crc kubenswrapper[5116]: E0322 00:34:45.605607 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls podName:6a4e1e85-870e-408e-a4dd-c5a7d7fcecae nodeName:}" failed. 
No retries permitted until 2026-03-22 00:34:46.105583246 +0000 UTC m=+1557.127884629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "6a4e1e85-870e-408e-a4dd-c5a7d7fcecae") : secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605440 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-volume\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.606459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.606515 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bt6\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-kube-api-access-c2bt6\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.610187 5116 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.612137 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-out\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.612625 5116 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1e6735ba0855894953e997216cd431bb4709665fadbca3e3e6ab15f305ef49eb/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.614062 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-web-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.615014 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.615822 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-volume\") pod \"alertmanager-default-0\" (UID: 
\"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.620213 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.623412 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.628273 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bt6\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-kube-api-access-c2bt6\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.641390 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:46 crc kubenswrapper[5116]: I0322 00:34:46.113146 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod 
\"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:46 crc kubenswrapper[5116]: E0322 00:34:46.113315 5116 secret.go:189] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:46 crc kubenswrapper[5116]: E0322 00:34:46.113517 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls podName:6a4e1e85-870e-408e-a4dd-c5a7d7fcecae nodeName:}" failed. No retries permitted until 2026-03-22 00:34:47.113492141 +0000 UTC m=+1558.135793534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "6a4e1e85-870e-408e-a4dd-c5a7d7fcecae") : secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:47 crc kubenswrapper[5116]: I0322 00:34:47.129062 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:47 crc kubenswrapper[5116]: E0322 00:34:47.129279 5116 secret.go:189] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:47 crc kubenswrapper[5116]: E0322 00:34:47.129364 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls podName:6a4e1e85-870e-408e-a4dd-c5a7d7fcecae nodeName:}" failed. 
No retries permitted until 2026-03-22 00:34:49.129346682 +0000 UTC m=+1560.151648055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "6a4e1e85-870e-408e-a4dd-c5a7d7fcecae") : secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:49 crc kubenswrapper[5116]: I0322 00:34:49.158803 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:49 crc kubenswrapper[5116]: I0322 00:34:49.166055 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:49 crc kubenswrapper[5116]: I0322 00:34:49.304068 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.033771 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.707752 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" event={"ID":"536dbd5d-337e-43fa-925a-e88d3be7da06","Type":"ContainerStarted","Data":"631acff58a9da0052a575a8d9aa4f4ea970f2b6070611466cd8f2e9d2f095c3c"} Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.707927 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:34:51 crc kubenswrapper[5116]: E0322 00:34:51.708282 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.710965 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"944a4f90bd33711769e78227464fa7814b61bddfb0732e2e2218d5afcadf8d6a"} Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.726701 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" podStartSLOduration=2.550659586 podStartE2EDuration="10.726680097s" podCreationTimestamp="2026-03-22 00:34:41 +0000 UTC" firstStartedPulling="2026-03-22 00:34:42.47785989 +0000 UTC m=+1553.500161263" lastFinishedPulling="2026-03-22 
00:34:50.653880401 +0000 UTC m=+1561.676181774" observedRunningTime="2026-03-22 00:34:51.721868745 +0000 UTC m=+1562.744170128" watchObservedRunningTime="2026-03-22 00:34:51.726680097 +0000 UTC m=+1562.748981490" Mar 22 00:34:53 crc kubenswrapper[5116]: I0322 00:34:53.729130 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"2b1982fd22c4e01709284329d1904d970fb21c60fb9936f1b710c1c5e9e998af"} Mar 22 00:34:54 crc kubenswrapper[5116]: I0322 00:34:54.647374 5116 scope.go:117] "RemoveContainer" containerID="6aa0210430a18d2b96a0bdd3c189fd69455e4afb5753bc588ac349572da2555d" Mar 22 00:34:55 crc kubenswrapper[5116]: I0322 00:34:55.745662 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"f7efbe447e97065fd04b575590d695c99b1492f6ba1e95418a796d203ccfeeae"} Mar 22 00:34:57 crc kubenswrapper[5116]: I0322 00:34:57.760609 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"4a0fda02ce06bb91956db76ab16c54bf2e872d2e728df77278064c8c47c3417b"} Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.696680 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l"] Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.778953 5116 generic.go:358] "Generic (PLEG): container finished" podID="6a4e1e85-870e-408e-a4dd-c5a7d7fcecae" containerID="2b1982fd22c4e01709284329d1904d970fb21c60fb9936f1b710c1c5e9e998af" exitCode=0 Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.956276 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerDied","Data":"2b1982fd22c4e01709284329d1904d970fb21c60fb9936f1b710c1c5e9e998af"} Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.956425 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.956470 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l"] Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.959512 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-coll-meter-sg-core-configmap\"" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.959800 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-cloud1-coll-meter-proxy-tls\"" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.959959 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"smart-gateway-session-secret\"" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.961723 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"smart-gateway-dockercfg-bd2n9\"" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120410 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120510 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120563 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120602 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120625 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpjjd\" (UniqueName: \"kubernetes.io/projected/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-kube-api-access-cpjjd\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222401 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222481 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222524 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222554 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222577 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpjjd\" (UniqueName: \"kubernetes.io/projected/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-kube-api-access-cpjjd\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: 
\"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.223070 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.223140 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls podName:8e2f0d9c-7207-4164-a27f-9efeba6e22bb nodeName:}" failed. No retries permitted until 2026-03-22 00:34:59.723121943 +0000 UTC m=+1570.745423316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" (UID: "8e2f0d9c-7207-4164-a27f-9efeba6e22bb") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.223833 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.224482 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc 
kubenswrapper[5116]: I0322 00:34:59.234669 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.242473 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpjjd\" (UniqueName: \"kubernetes.io/projected/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-kube-api-access-cpjjd\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.729863 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.730032 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.730126 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls podName:8e2f0d9c-7207-4164-a27f-9efeba6e22bb nodeName:}" failed. No retries permitted until 2026-03-22 00:35:00.730102479 +0000 UTC m=+1571.752403872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" (UID: "8e2f0d9c-7207-4164-a27f-9efeba6e22bb") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.552254 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.597192 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.597319 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743080 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743188 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743254 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743321 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.797440 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844028 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844123 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844220 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844598 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844953 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.861810 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.930383 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.086556 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.401878 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q"] Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.426394 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q"] Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.426531 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.429673 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-cloud1-ceil-meter-proxy-tls\"" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.430872 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-ceil-meter-sg-core-configmap\"" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552433 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552691 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: 
\"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552809 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552902 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6fw\" (UniqueName: \"kubernetes.io/projected/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-kube-api-access-6m6fw\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552947 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654082 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc 
kubenswrapper[5116]: I0322 00:35:01.654179 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654237 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6fw\" (UniqueName: \"kubernetes.io/projected/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-kube-api-access-6m6fw\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654265 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654353 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: E0322 00:35:01.654482 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret 
"default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:01 crc kubenswrapper[5116]: E0322 00:35:01.654547 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls podName:c2a84a49-cbcf-41dd-8ed2-1df6cd7db259 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:02.15452725 +0000 UTC m=+1573.176828633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" (UID: "c2a84a49-cbcf-41dd-8ed2-1df6cd7db259") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.656117 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.658021 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.660864 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: 
\"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.670996 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6fw\" (UniqueName: \"kubernetes.io/projected/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-kube-api-access-6m6fw\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:02 crc kubenswrapper[5116]: I0322 00:35:02.160847 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:02 crc kubenswrapper[5116]: E0322 00:35:02.160996 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:02 crc kubenswrapper[5116]: E0322 00:35:02.161058 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls podName:c2a84a49-cbcf-41dd-8ed2-1df6cd7db259 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:03.161041811 +0000 UTC m=+1574.183343184 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" (UID: "c2a84a49-cbcf-41dd-8ed2-1df6cd7db259") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:03 crc kubenswrapper[5116]: I0322 00:35:03.174182 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:03 crc kubenswrapper[5116]: I0322 00:35:03.188405 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:03 crc kubenswrapper[5116]: I0322 00:35:03.245287 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.763293 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.844443 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.853595 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.935813 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.944869 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.945073 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.947215 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-cloud1-sens-meter-proxy-tls\"" Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.947836 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-sens-meter-sg-core-configmap\"" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025209 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025334 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6d17f0-5973-421e-9838-1a6195ca1731-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025367 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tb5p\" (UniqueName: \"kubernetes.io/projected/1d6d17f0-5973-421e-9838-1a6195ca1731-kube-api-access-8tb5p\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025390 
5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d6d17f0-5973-421e-9838-1a6195ca1731-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025501 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126531 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6d17f0-5973-421e-9838-1a6195ca1731-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126583 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tb5p\" (UniqueName: \"kubernetes.io/projected/1d6d17f0-5973-421e-9838-1a6195ca1731-kube-api-access-8tb5p\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126613 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/1d6d17f0-5973-421e-9838-1a6195ca1731-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126659 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126701 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.126868 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.127061 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6d17f0-5973-421e-9838-1a6195ca1731-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.127143 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls podName:1d6d17f0-5973-421e-9838-1a6195ca1731 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:05.626959406 +0000 UTC m=+1576.649260779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" (UID: "1d6d17f0-5973-421e-9838-1a6195ca1731") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.127412 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d6d17f0-5973-421e-9838-1a6195ca1731-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.150138 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tb5p\" (UniqueName: \"kubernetes.io/projected/1d6d17f0-5973-421e-9838-1a6195ca1731-kube-api-access-8tb5p\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.153806 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 
crc kubenswrapper[5116]: I0322 00:35:05.635465 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.635829 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.635904 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls podName:1d6d17f0-5973-421e-9838-1a6195ca1731 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:06.635883592 +0000 UTC m=+1577.658184965 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" (UID: "1d6d17f0-5973-421e-9838-1a6195ca1731") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.849977 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"73af41f1b34db2aa72ddcf2cdfb268e5f5d448440f5f21782adb05c23ebbc7bd"} Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.851302 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerStarted","Data":"7100f58fc42898dd88286865a83e861cb4a7d901326e4759acb51b3639cfe743"} Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.853451 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"2c957c4c3220bf67de939876cc1681bfa89a50c3e0747dd74d1db24c799c2691"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.651590 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.678947 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.697670 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:35:06 crc kubenswrapper[5116]: E0322 00:35:06.697983 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.787103 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.863016 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"0582d2cf2a83fbb70d9e4f9ae0e4a04780481473ce5cdeec5fe7a1f24097d7c6"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.865416 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"aca9f6bcf759d14aba4e46449177bb7093c935bd7505cb25d6a81b43f1f882e0"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.868872 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"03c3b2e14062911452089c7edc2b2b8d00cd4d1ff77394dfb91ac45933c37e35"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.870553 5116 generic.go:358] "Generic (PLEG): container finished" podID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9" exitCode=0 Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.870677 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.873093 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" 
event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"40c129001d4188023f7270fc28d50f2540d93840c59fe23071464b4a0b6fa30d"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.889376 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.204008863 podStartE2EDuration="35.88935983s" podCreationTimestamp="2026-03-22 00:34:31 +0000 UTC" firstStartedPulling="2026-03-22 00:34:34.227360926 +0000 UTC m=+1545.249662309" lastFinishedPulling="2026-03-22 00:35:04.912711903 +0000 UTC m=+1575.935013276" observedRunningTime="2026-03-22 00:35:06.886085717 +0000 UTC m=+1577.908387090" watchObservedRunningTime="2026-03-22 00:35:06.88935983 +0000 UTC m=+1577.911661203" Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.611845 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5"] Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.887824 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerStarted","Data":"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"} Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.893722 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"2cf5090c5ad46f07d8a3dede9837b9ea2d6b6ac8502c78847377024c2bce79c0"} Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.897336 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e"} 
Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.904295 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"5673e97709ad82226f5fb15650c9f69362db1f34846064e348aa9875834ac86e"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.907523 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"df926cb85cf0c0aacf22badcb179b743b5a8a9906e6b9415868832aa05028a7c"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.915367 5116 generic.go:358] "Generic (PLEG): container finished" podID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73" exitCode=0 Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.915568 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.918313 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.928243 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="service-telemetry/prometheus-default-0" Mar 22 00:35:09 crc kubenswrapper[5116]: I0322 00:35:09.928278 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" 
event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerStarted","Data":"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"} Mar 22 00:35:10 crc kubenswrapper[5116]: I0322 00:35:10.956715 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xvfm2" podStartSLOduration=10.280415915 podStartE2EDuration="10.956692367s" podCreationTimestamp="2026-03-22 00:35:00 +0000 UTC" firstStartedPulling="2026-03-22 00:35:06.871110377 +0000 UTC m=+1577.893411760" lastFinishedPulling="2026-03-22 00:35:07.547386839 +0000 UTC m=+1578.569688212" observedRunningTime="2026-03-22 00:35:10.956382237 +0000 UTC m=+1581.978683630" watchObservedRunningTime="2026-03-22 00:35:10.956692367 +0000 UTC m=+1581.978993740" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.004305 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"] Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.074327 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"] Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.074517 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.076612 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-cert\"" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.076686 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-coll-event-sg-core-configmap\"" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152125 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f826e257-238f-4715-8010-f30569577292-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152205 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f826e257-238f-4715-8010-f30569577292-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152350 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6m8\" (UniqueName: \"kubernetes.io/projected/f826e257-238f-4715-8010-f30569577292-kube-api-access-wb6m8\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152551 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f826e257-238f-4715-8010-f30569577292-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253732 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f826e257-238f-4715-8010-f30569577292-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253776 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f826e257-238f-4715-8010-f30569577292-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253810 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6m8\" (UniqueName: \"kubernetes.io/projected/f826e257-238f-4715-8010-f30569577292-kube-api-access-wb6m8\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253868 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f826e257-238f-4715-8010-f30569577292-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.254377 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f826e257-238f-4715-8010-f30569577292-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.254767 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f826e257-238f-4715-8010-f30569577292-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.274952 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6m8\" (UniqueName: \"kubernetes.io/projected/f826e257-238f-4715-8010-f30569577292-kube-api-access-wb6m8\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.274975 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f826e257-238f-4715-8010-f30569577292-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.390599 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.667100 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"]
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.771368 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.774499 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-ceil-event-sg-core-configmap\""
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.778711 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdvr\" (UniqueName: \"kubernetes.io/projected/874ff775-e791-4357-9719-a773b3a8e4d8-kube-api-access-xkdvr\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.778784 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/874ff775-e791-4357-9719-a773b3a8e4d8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.778876 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/874ff775-e791-4357-9719-a773b3a8e4d8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.779126 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/874ff775-e791-4357-9719-a773b3a8e4d8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.779704 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"]
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.881508 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdvr\" (UniqueName: \"kubernetes.io/projected/874ff775-e791-4357-9719-a773b3a8e4d8-kube-api-access-xkdvr\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.881638 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/874ff775-e791-4357-9719-a773b3a8e4d8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.881855 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/874ff775-e791-4357-9719-a773b3a8e4d8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.882129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/874ff775-e791-4357-9719-a773b3a8e4d8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.882642 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/874ff775-e791-4357-9719-a773b3a8e4d8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.882722 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/874ff775-e791-4357-9719-a773b3a8e4d8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.892058 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/874ff775-e791-4357-9719-a773b3a8e4d8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.909431 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"]
Mar 22 00:35:13 crc kubenswrapper[5116]: W0322 00:35:13.913539 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf826e257_238f_4715_8010_f30569577292.slice/crio-c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c WatchSource:0}: Error finding container c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c: Status 404 returned error can't find the container with id c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.915005 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdvr\" (UniqueName: \"kubernetes.io/projected/874ff775-e791-4357-9719-a773b3a8e4d8-kube-api-access-xkdvr\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.956611 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c"}
Mar 22 00:35:14 crc kubenswrapper[5116]: I0322 00:35:14.090028 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"
Mar 22 00:35:18 crc kubenswrapper[5116]: I0322 00:35:18.929503 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0"
Mar 22 00:35:18 crc kubenswrapper[5116]: I0322 00:35:18.973786 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0"
Mar 22 00:35:19 crc kubenswrapper[5116]: I0322 00:35:19.030911 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0"
Mar 22 00:35:19 crc kubenswrapper[5116]: I0322 00:35:19.736104 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:35:19 crc kubenswrapper[5116]: E0322 00:35:19.737066 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.004092 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"]
Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.005720 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440"}
Mar 22 00:35:20 crc kubenswrapper[5116]: W0322 00:35:20.035974 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod874ff775_e791_4357_9719_a773b3a8e4d8.slice/crio-7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18 WatchSource:0}: Error finding container 7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18: Status 404 returned error can't find the container with id 7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18
Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.931542 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvfm2"
Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.933318 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-xvfm2"
Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.994053 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvfm2"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.016025 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"db2de75efae1f99672ef051de2d624c33479b2410198cea11e989e235ebdd662"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.018791 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"45969a2287bcba0858b98439f73dd9672c718889ae381f10574f35d1bdcd5701"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.022463 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"34a142d14fdb70f283397702f4a0e3569bc2c7c63099ed71bd5d66c03fb31242"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.022507 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.022520 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.025107 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"be22b1c1d5c7c321fa354ab59ee442ff30a573bafc539073c1dc06c846b02fa3"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.025147 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.028343 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"83271e63a11c7874f6bc12c5e162895af826ebec23720f3e7be635d16973683b"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.032064 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"ffb5e995226a94702f1064e670a21e0075a72a8bc8c07e3cbf4f6f0329faf269"}
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.054228 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=16.285166252 podStartE2EDuration="37.054195896s" podCreationTimestamp="2026-03-22 00:34:44 +0000 UTC" firstStartedPulling="2026-03-22 00:34:58.957574602 +0000 UTC m=+1569.979875975" lastFinishedPulling="2026-03-22 00:35:19.726604246 +0000 UTC m=+1590.748905619" observedRunningTime="2026-03-22 00:35:21.044379769 +0000 UTC m=+1592.066681162" watchObservedRunningTime="2026-03-22 00:35:21.054195896 +0000 UTC m=+1592.076497309"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.082097 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xvfm2"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.083257 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" podStartSLOduration=2.832020778 podStartE2EDuration="9.083218475s" podCreationTimestamp="2026-03-22 00:35:12 +0000 UTC" firstStartedPulling="2026-03-22 00:35:13.916429729 +0000 UTC m=+1584.938731102" lastFinishedPulling="2026-03-22 00:35:20.167627436 +0000 UTC m=+1591.189928799" observedRunningTime="2026-03-22 00:35:21.078753495 +0000 UTC m=+1592.101054888" watchObservedRunningTime="2026-03-22 00:35:21.083218475 +0000 UTC m=+1592.105519858"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.098117 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" podStartSLOduration=4.619487843 podStartE2EDuration="17.098095611s" podCreationTimestamp="2026-03-22 00:35:04 +0000 UTC" firstStartedPulling="2026-03-22 00:35:07.614314246 +0000 UTC m=+1578.636615619" lastFinishedPulling="2026-03-22 00:35:20.092922014 +0000 UTC m=+1591.115223387" observedRunningTime="2026-03-22 00:35:21.095268112 +0000 UTC m=+1592.117569495" watchObservedRunningTime="2026-03-22 00:35:21.098095611 +0000 UTC m=+1592.120396994"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.150130 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" podStartSLOduration=8.911133321 podStartE2EDuration="23.150104761s" podCreationTimestamp="2026-03-22 00:34:58 +0000 UTC" firstStartedPulling="2026-03-22 00:35:05.496958359 +0000 UTC m=+1576.519259732" lastFinishedPulling="2026-03-22 00:35:19.735929799 +0000 UTC m=+1590.758231172" observedRunningTime="2026-03-22 00:35:21.119780311 +0000 UTC m=+1592.142081724" watchObservedRunningTime="2026-03-22 00:35:21.150104761 +0000 UTC m=+1592.172406134"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.158091 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" podStartSLOduration=7.8463985659999995 podStartE2EDuration="8.158068521s" podCreationTimestamp="2026-03-22 00:35:13 +0000 UTC" firstStartedPulling="2026-03-22 00:35:20.041402301 +0000 UTC m=+1591.063703674" lastFinishedPulling="2026-03-22 00:35:20.353072256 +0000 UTC m=+1591.375373629" observedRunningTime="2026-03-22 00:35:21.137355051 +0000 UTC m=+1592.159656424" watchObservedRunningTime="2026-03-22 00:35:21.158068521 +0000 UTC m=+1592.180369894"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.181629 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" podStartSLOduration=5.833926723 podStartE2EDuration="20.181452184s" podCreationTimestamp="2026-03-22 00:35:01 +0000 UTC" firstStartedPulling="2026-03-22 00:35:05.49793427 +0000 UTC m=+1576.520235643" lastFinishedPulling="2026-03-22 00:35:19.845459731 +0000 UTC m=+1590.867761104" observedRunningTime="2026-03-22 00:35:21.157297106 +0000 UTC m=+1592.179598479" watchObservedRunningTime="2026-03-22 00:35:21.181452184 +0000 UTC m=+1592.203753557"
Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.227895 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"]
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.045997 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xvfm2" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server" containerID="cri-o://4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d" gracePeriod=2
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.434811 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2"
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.524643 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") "
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.524690 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") "
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.524761 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") "
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.525670 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities" (OuterVolumeSpecName: "utilities") pod "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" (UID: "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.532900 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4" (OuterVolumeSpecName: "kube-api-access-fsvs4") pod "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" (UID: "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef"). InnerVolumeSpecName "kube-api-access-fsvs4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.573634 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" (UID: "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.626121 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.626177 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.626187 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.053936 5116 generic.go:358] "Generic (PLEG): container finished" podID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d" exitCode=0
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054042 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054079 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"}
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054132 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"7100f58fc42898dd88286865a83e861cb4a7d901326e4759acb51b3639cfe743"}
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054156 5116 scope.go:117] "RemoveContainer" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.077209 5116 scope.go:117] "RemoveContainer" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.088331 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"]
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.100896 5116 scope.go:117] "RemoveContainer" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.114048 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"]
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.130747 5116 scope.go:117] "RemoveContainer" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"
Mar 22 00:35:24 crc kubenswrapper[5116]: E0322 00:35:24.131123 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d\": container with ID starting with 4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d not found: ID does not exist" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131181 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"} err="failed to get container status \"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d\": rpc error: code = NotFound desc = could not find container \"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d\": container with ID starting with 4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d not found: ID does not exist"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131208 5116 scope.go:117] "RemoveContainer" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"
Mar 22 00:35:24 crc kubenswrapper[5116]: E0322 00:35:24.131450 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73\": container with ID starting with c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73 not found: ID does not exist" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131467 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"} err="failed to get container status \"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73\": rpc error: code = NotFound desc = could not find container \"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73\": container with ID starting with c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73 not found: ID does not exist"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131479 5116 scope.go:117] "RemoveContainer" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9"
Mar 22 00:35:24 crc kubenswrapper[5116]: E0322 00:35:24.131630 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9\": container with ID starting with dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9 not found: ID does not exist" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9"
Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131643 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9"} err="failed to get container status \"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9\": rpc error: code = NotFound desc = could not find container \"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9\": container with ID starting with dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9 not found: ID does not exist"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.064261 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"]
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.064685 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect" containerID="cri-o://13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e" gracePeriod=30
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.439583 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.467100 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-9ppbp"]
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469003 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469050 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469080 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469088 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469100 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-utilities"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469106 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-utilities"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469142 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-content"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469147 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-content"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469269 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469282 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.473735 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.484120 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-9ppbp"]
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.554809 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.554981 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555076 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555102 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555131 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555191 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555348 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555469 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555517 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555580 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5v4m\" (UniqueName: \"kubernetes.io/projected/8d54082f-1922-421a-85c0-77b5d30d7e68-kube-api-access-t5v4m\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555608 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555665 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-config\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555720 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: 
\"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555804 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-users\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.558196 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.561473 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.562307 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p" (OuterVolumeSpecName: "kube-api-access-ssw5p") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). 
InnerVolumeSpecName "kube-api-access-ssw5p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.569310 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.570140 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.571301 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.578347 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). 
InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657327 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5v4m\" (UniqueName: \"kubernetes.io/projected/8d54082f-1922-421a-85c0-77b5d30d7e68-kube-api-access-t5v4m\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657369 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657397 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-config\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657988 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658225 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-users\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658314 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658381 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658491 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658513 5116 reconciler_common.go:299] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658529 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658543 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658558 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658572 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658585 5116 reconciler_common.go:299] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.660643 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.660983 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: 
\"kubernetes.io/configmap/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-config\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.662674 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.662895 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.663969 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-users\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.666084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc 
kubenswrapper[5116]: I0322 00:35:25.676995 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5v4m\" (UniqueName: \"kubernetes.io/projected/8d54082f-1922-421a-85c0-77b5d30d7e68-kube-api-access-t5v4m\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.709285 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" path="/var/lib/kubelet/pods/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef/volumes" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.788549 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" Mar 22 00:35:26 crc kubenswrapper[5116]: E0322 00:35:26.062975 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a84a49_cbcf_41dd_8ed2_1df6cd7db259.slice/crio-conmon-75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e2f0d9c_7207_4164_a27f_9efeba6e22bb.slice/crio-conmon-3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e.scope\": RecentStats: unable to find data in memory cache]" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.084875 5116 generic.go:358] "Generic (PLEG): container finished" podID="8e2f0d9c-7207-4164-a27f-9efeba6e22bb" containerID="3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e" exitCode=0 Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.085022 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" 
event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerDied","Data":"3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e"} Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.085687 5116 scope.go:117] "RemoveContainer" containerID="3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.091891 5116 generic.go:358] "Generic (PLEG): container finished" podID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e" exitCode=0 Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.091947 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.092254 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerDied","Data":"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"} Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.092292 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerDied","Data":"bf3991c3ece1f301b0d1fccfff351a4b6d180bae7f391fffa01dbd097f35a02d"} Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.092322 5116 scope.go:117] "RemoveContainer" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.102381 5116 generic.go:358] "Generic (PLEG): container finished" podID="c2a84a49-cbcf-41dd-8ed2-1df6cd7db259" containerID="75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042" exitCode=0 Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.102487 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerDied","Data":"75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042"} Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.110054 5116 scope.go:117] "RemoveContainer" containerID="75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.116959 5116 generic.go:358] "Generic (PLEG): container finished" podID="874ff775-e791-4357-9719-a773b3a8e4d8" containerID="bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618" exitCode=0 Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.117044 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerDied","Data":"bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618"} Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.117516 5116 scope.go:117] "RemoveContainer" containerID="bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.138179 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.146420 5116 generic.go:358] "Generic (PLEG): container finished" podID="f826e257-238f-4715-8010-f30569577292" containerID="cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a" exitCode=0 Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.146495 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerDied","Data":"cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a"} Mar 22 00:35:26 crc 
kubenswrapper[5116]: I0322 00:35:26.147065 5116 scope.go:117] "RemoveContainer" containerID="cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.148868 5116 scope.go:117] "RemoveContainer" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.148991 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:35:26 crc kubenswrapper[5116]: E0322 00:35:26.154255 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e\": container with ID starting with 13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e not found: ID does not exist" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.154305 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"} err="failed to get container status \"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e\": rpc error: code = NotFound desc = could not find container \"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e\": container with ID starting with 13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e not found: ID does not exist" Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.260381 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-9ppbp"] Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.156188 5116 generic.go:358] "Generic (PLEG): container finished" podID="1d6d17f0-5973-421e-9838-1a6195ca1731" 
containerID="f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440" exitCode=0 Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.156305 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerDied","Data":"f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440"} Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.157449 5116 scope.go:117] "RemoveContainer" containerID="f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440" Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.161482 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50"} Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.165994 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" event={"ID":"8d54082f-1922-421a-85c0-77b5d30d7e68","Type":"ContainerStarted","Data":"d57ef9bb49b3b11aaedb1cccd0272d3181c5da638914769f6096dfdd7d2ef4c4"} Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.166277 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" event={"ID":"8d54082f-1922-421a-85c0-77b5d30d7e68","Type":"ContainerStarted","Data":"cb26f4a368af69a8b7a008081c9816094bb65715cc0ab7a96bac4b42191a080a"} Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.170584 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959"} Mar 22 00:35:27 crc 
kubenswrapper[5116]: I0322 00:35:27.179674 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c"} Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.182456 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd"} Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.272185 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" podStartSLOduration=2.272151151 podStartE2EDuration="2.272151151s" podCreationTimestamp="2026-03-22 00:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:35:27.261613852 +0000 UTC m=+1598.283915245" watchObservedRunningTime="2026-03-22 00:35:27.272151151 +0000 UTC m=+1598.294452524" Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.704584 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" path="/var/lib/kubelet/pods/e29f610d-8ef1-4992-857d-7b39f8694e44/volumes" Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.198772 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366"} Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209190 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="8e2f0d9c-7207-4164-a27f-9efeba6e22bb" containerID="b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50" exitCode=0 Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209419 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerDied","Data":"b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50"} Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209455 5116 scope.go:117] "RemoveContainer" containerID="3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e" Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209925 5116 scope.go:117] "RemoveContainer" containerID="b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50" Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.210161 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l_service-telemetry(8e2f0d9c-7207-4164-a27f-9efeba6e22bb)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" podUID="8e2f0d9c-7207-4164-a27f-9efeba6e22bb" Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.214781 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerDied","Data":"3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959"} Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.215883 5116 scope.go:117] "RemoveContainer" containerID="3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959" Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.216370 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q_service-telemetry(c2a84a49-cbcf-41dd-8ed2-1df6cd7db259)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" podUID="c2a84a49-cbcf-41dd-8ed2-1df6cd7db259"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.212468 5116 generic.go:358] "Generic (PLEG): container finished" podID="c2a84a49-cbcf-41dd-8ed2-1df6cd7db259" containerID="3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959" exitCode=0
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.261050 5116 generic.go:358] "Generic (PLEG): container finished" podID="874ff775-e791-4357-9719-a773b3a8e4d8" containerID="080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c" exitCode=0
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.261190 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerDied","Data":"080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c"}
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.261459 5116 scope.go:117] "RemoveContainer" containerID="75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.262162 5116 scope.go:117] "RemoveContainer" containerID="080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c"
Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.262471 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6_service-telemetry(874ff775-e791-4357-9719-a773b3a8e4d8)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" podUID="874ff775-e791-4357-9719-a773b3a8e4d8"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.276188 5116 generic.go:358] "Generic (PLEG): container finished" podID="f826e257-238f-4715-8010-f30569577292" containerID="c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd" exitCode=0
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.276991 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerDied","Data":"c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd"}
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.277352 5116 scope.go:117] "RemoveContainer" containerID="c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd"
Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.277596 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-664479c99b-v2bfp_service-telemetry(f826e257-238f-4715-8010-f30569577292)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" podUID="f826e257-238f-4715-8010-f30569577292"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.296516 5116 scope.go:117] "RemoveContainer" containerID="bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.343379 5116 scope.go:117] "RemoveContainer" containerID="cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a"
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.292886 5116 generic.go:358] "Generic (PLEG): container finished" podID="1d6d17f0-5973-421e-9838-1a6195ca1731" containerID="e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366" exitCode=0
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.293021 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerDied","Data":"e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366"}
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.293147 5116 scope.go:117] "RemoveContainer" containerID="f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440"
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.293677 5116 scope.go:117] "RemoveContainer" containerID="e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366"
Mar 22 00:35:29 crc kubenswrapper[5116]: E0322 00:35:29.294061 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5_service-telemetry(1d6d17f0-5973-421e-9838-1a6195ca1731)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" podUID="1d6d17f0-5973-421e-9838-1a6195ca1731"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.017320 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"]
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.027597 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.036795 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-selfsigned\""
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.037014 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"qdr-test-config\""
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.044000 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.162430 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/90868bad-f18e-4266-a1cf-e633cccb0c4b-qdr-test-config\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.162755 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79cf\" (UniqueName: \"kubernetes.io/projected/90868bad-f18e-4266-a1cf-e633cccb0c4b-kube-api-access-q79cf\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.162798 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/90868bad-f18e-4266-a1cf-e633cccb0c4b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.264138 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q79cf\" (UniqueName: \"kubernetes.io/projected/90868bad-f18e-4266-a1cf-e633cccb0c4b-kube-api-access-q79cf\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.264232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/90868bad-f18e-4266-a1cf-e633cccb0c4b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.264306 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/90868bad-f18e-4266-a1cf-e633cccb0c4b-qdr-test-config\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.265095 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/90868bad-f18e-4266-a1cf-e633cccb0c4b-qdr-test-config\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.271928 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/90868bad-f18e-4266-a1cf-e633cccb0c4b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.279307 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79cf\" (UniqueName: \"kubernetes.io/projected/90868bad-f18e-4266-a1cf-e633cccb0c4b-kube-api-access-q79cf\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.351175 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.562599 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 22 00:35:31 crc kubenswrapper[5116]: W0322 00:35:31.572080 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90868bad_f18e_4266_a1cf_e633cccb0c4b.slice/crio-4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491 WatchSource:0}: Error finding container 4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491: Status 404 returned error can't find the container with id 4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491
Mar 22 00:35:32 crc kubenswrapper[5116]: I0322 00:35:32.319730 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"90868bad-f18e-4266-a1cf-e633cccb0c4b","Type":"ContainerStarted","Data":"4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491"}
Mar 22 00:35:33 crc kubenswrapper[5116]: I0322 00:35:33.697036 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:35:33 crc kubenswrapper[5116]: E0322 00:35:33.697534 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.377702 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"90868bad-f18e-4266-a1cf-e633cccb0c4b","Type":"ContainerStarted","Data":"617a091a99cd55ca5a220878be801e2fb93ede016235ce0e811d04e68be20306"}
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.401765 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.13611172 podStartE2EDuration="8.401742714s" podCreationTimestamp="2026-03-22 00:35:31 +0000 UTC" firstStartedPulling="2026-03-22 00:35:31.575572407 +0000 UTC m=+1602.597873780" lastFinishedPulling="2026-03-22 00:35:38.841203401 +0000 UTC m=+1609.863504774" observedRunningTime="2026-03-22 00:35:39.39456761 +0000 UTC m=+1610.416869003" watchObservedRunningTime="2026-03-22 00:35:39.401742714 +0000 UTC m=+1610.424044097"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.712856 5116 scope.go:117] "RemoveContainer" containerID="c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.713217 5116 scope.go:117] "RemoveContainer" containerID="b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.713684 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-fq6th"]
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.713724 5116 scope.go:117] "RemoveContainer" containerID="3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.737257 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-fq6th"]
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.737538 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740688 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-ceilometer-entrypoint-script\""
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740930 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-ceilometer-publisher\""
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740718 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-collectd-entrypoint-script\""
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740753 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-sensubility-config\""
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740688 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-healthcheck-log\""
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.741632 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-collectd-config\""
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892410 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892761 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892802 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892834 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892871 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892954 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.893016 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.994884 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.994940 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.994976 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995017 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995126 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995460 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995564 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.996419 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.996532 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.996997 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.997444 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.998656 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.000021 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.022924 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.068779 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.208026 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"]
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.212226 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.214785 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.304489 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"curl\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") " pod="service-telemetry/curl"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.388061 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"cc828f581887fc1fadfbecd7b268fbbeb23585aeb83436fe041c341f09db5496"}
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.392662 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"ade617f3251dcbda4dad6917f454d25bd36f6f4c3c96db162d9b2dd2dc267aa5"}
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.395194 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"4681bd487cb8451d044892f561b12bb15a002abb5bdee0edd0e114e0905781af"}
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.407396 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"curl\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") " pod="service-telemetry/curl"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.440582 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"curl\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") " pod="service-telemetry/curl"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.541784 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.623444 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-fq6th"]
Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.990486 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 22 00:35:41 crc kubenswrapper[5116]: I0322 00:35:41.404510 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerStarted","Data":"aceab4cbda2cdab69ea549cea0457aaa095384a3a7237572e46e8cc591a43c67"}
Mar 22 00:35:41 crc kubenswrapper[5116]: I0322 00:35:41.406215 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"08818159-a850-4f98-8d52-3685351c96e4","Type":"ContainerStarted","Data":"e583baa57c9a16bc4d1316f91cda6f5dca1ec0de1b504d701235d1a37f93667f"}
Mar 22 00:35:41 crc kubenswrapper[5116]: I0322 00:35:41.697880 5116 scope.go:117] "RemoveContainer" containerID="e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366"
Mar 22 00:35:41 crc kubenswrapper[5116]: I0322 00:35:41.697978 5116 scope.go:117] "RemoveContainer" containerID="080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c"
Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.425127 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"410b6262c50815a5ed33819dc3bec1eaa9ed41ec7966e6426b7941c43d7c633e"}
Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.431444 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"674c7c53f050eae528d2d61757a061f140572ded3bc868f29fd51fd161ab8cbe"}
Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.435401 5116 generic.go:358] "Generic (PLEG): container finished" podID="08818159-a850-4f98-8d52-3685351c96e4" containerID="5a89a5faf89fe511861b949dea89be2304c3b4ec8557f73e3c852a8b6c3274c7" exitCode=0
Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.435578 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"08818159-a850-4f98-8d52-3685351c96e4","Type":"ContainerDied","Data":"5a89a5faf89fe511861b949dea89be2304c3b4ec8557f73e3c852a8b6c3274c7"}
Mar 22 00:35:47 crc kubenswrapper[5116]: I0322 00:35:47.699065 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:35:47 crc kubenswrapper[5116]: E0322 00:35:47.700233 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.654348 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.825653 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_08818159-a850-4f98-8d52-3685351c96e4/curl/0.log"
Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.842936 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"08818159-a850-4f98-8d52-3685351c96e4\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") "
Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.853994 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg" (OuterVolumeSpecName: "kube-api-access-ng9pg") pod "08818159-a850-4f98-8d52-3685351c96e4" (UID: "08818159-a850-4f98-8d52-3685351c96e4"). InnerVolumeSpecName "kube-api-access-ng9pg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.944657 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.091091 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-694dc457d5-77wwx_536dbd5d-337e-43fa-925a-e88d3be7da06/prometheus-webhook-snmp/0.log"
Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.487950 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"08818159-a850-4f98-8d52-3685351c96e4","Type":"ContainerDied","Data":"e583baa57c9a16bc4d1316f91cda6f5dca1ec0de1b504d701235d1a37f93667f"}
Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.487994 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e583baa57c9a16bc4d1316f91cda6f5dca1ec0de1b504d701235d1a37f93667f"
Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.488056 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 22 00:35:53 crc kubenswrapper[5116]: I0322 00:35:53.521689 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerStarted","Data":"e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3"}
Mar 22 00:35:58 crc kubenswrapper[5116]: I0322 00:35:58.561273 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerStarted","Data":"2481a3fbab199fb083c39362d71c9c80752d96cb67c6ae685e405b92d8ea560d"}
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.144725 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-fq6th" podStartSLOduration=3.9207053800000002 podStartE2EDuration="21.144697112s" podCreationTimestamp="2026-03-22 00:35:39 +0000 UTC" firstStartedPulling="2026-03-22 00:35:40.646479028 +0000 UTC m=+1611.668780401" lastFinishedPulling="2026-03-22 00:35:57.87047076 +0000 UTC m=+1628.892772133" observedRunningTime="2026-03-22 00:35:58.586451716 +0000 UTC m=+1629.608753109" watchObservedRunningTime="2026-03-22 00:36:00.144697112 +0000 UTC m=+1631.166998525"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.154693 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"]
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.157609 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08818159-a850-4f98-8d52-3685351c96e4" containerName="curl"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.157859 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="08818159-a850-4f98-8d52-3685351c96e4" containerName="curl"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.158402 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="08818159-a850-4f98-8d52-3685351c96e4" containerName="curl"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.166219 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"]
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.166440 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.169572 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.170031 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.170382 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.338154 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"auto-csr-approver-29568996-kb9ll\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") " pod="openshift-infra/auto-csr-approver-29568996-kb9ll"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.440129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"auto-csr-approver-29568996-kb9ll\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") " pod="openshift-infra/auto-csr-approver-29568996-kb9ll"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.473227 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"auto-csr-approver-29568996-kb9ll\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") " pod="openshift-infra/auto-csr-approver-29568996-kb9ll"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.500993 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll"
Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.774271 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"]
Mar 22 00:36:00 crc kubenswrapper[5116]: W0322 00:36:00.778377 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a2fcae9_bfc7_4f9f_8c49_bac76c3a652a.slice/crio-619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a WatchSource:0}: Error finding container 619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a: Status 404 returned error can't find the container with id 619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a
Mar 22 00:36:01 crc kubenswrapper[5116]: I0322 00:36:01.591373 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" event={"ID":"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a","Type":"ContainerStarted","Data":"619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a"}
Mar 22 00:36:02 crc kubenswrapper[5116]: I0322 00:36:02.604369 5116 generic.go:358] "Generic (PLEG): container finished" podID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerID="688a5bb9aa102b8d7fbc699af35c11b9bcfdddc98bc6964ddb5e59d34be83553" exitCode=0
Mar 22 00:36:02 crc kubenswrapper[5116]: I0322 00:36:02.604596 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" event={"ID":"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a","Type":"ContainerDied","Data":"688a5bb9aa102b8d7fbc699af35c11b9bcfdddc98bc6964ddb5e59d34be83553"}
Mar 22 00:36:02 crc kubenswrapper[5116]: I0322 00:36:02.697677 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:36:02 crc kubenswrapper[5116]: E0322 00:36:02.698076 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:36:03 crc kubenswrapper[5116]: I0322 00:36:03.948935 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll"
Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.009002 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") "
Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.020527 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j" (OuterVolumeSpecName: "kube-api-access-5zj4j") pod "0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" (UID: "0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a"). InnerVolumeSpecName "kube-api-access-5zj4j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.110918 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") on node \"crc\" DevicePath \"\""
Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.633663 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll"
Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.633710 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" event={"ID":"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a","Type":"ContainerDied","Data":"619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a"}
Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.633768 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a"
Mar 22 00:36:05 crc kubenswrapper[5116]: I0322 00:36:05.041376 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"]
Mar 22 00:36:05 crc kubenswrapper[5116]: I0322 00:36:05.052124 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"]
Mar 22 00:36:05 crc kubenswrapper[5116]: I0322 00:36:05.714063 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753649c7-f19a-4b90-a29a-2108a691e934" path="/var/lib/kubelet/pods/753649c7-f19a-4b90-a29a-2108a691e934/volumes"
Mar 22 00:36:16 crc kubenswrapper[5116]: I0322 00:36:16.697090 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:36:16 crc kubenswrapper[5116]: E0322 00:36:16.699451 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:19 crc kubenswrapper[5116]: I0322 00:36:19.403747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-694dc457d5-77wwx_536dbd5d-337e-43fa-925a-e88d3be7da06/prometheus-webhook-snmp/0.log" Mar 22 00:36:27 crc kubenswrapper[5116]: I0322 00:36:27.841596 5116 generic.go:358] "Generic (PLEG): container finished" podID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerID="e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3" exitCode=0 Mar 22 00:36:27 crc kubenswrapper[5116]: I0322 00:36:27.841658 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerDied","Data":"e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3"} Mar 22 00:36:27 crc kubenswrapper[5116]: I0322 00:36:27.842770 5116 scope.go:117] "RemoveContainer" containerID="e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3" Mar 22 00:36:29 crc kubenswrapper[5116]: I0322 00:36:29.860665 5116 generic.go:358] "Generic (PLEG): container finished" podID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerID="2481a3fbab199fb083c39362d71c9c80752d96cb67c6ae685e405b92d8ea560d" exitCode=0 Mar 22 00:36:29 crc kubenswrapper[5116]: I0322 00:36:29.860750 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerDied","Data":"2481a3fbab199fb083c39362d71c9c80752d96cb67c6ae685e405b92d8ea560d"} Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.153601 
5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236241 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236337 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236471 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236509 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236541 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc 
kubenswrapper[5116]: I0322 00:36:31.236637 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236770 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.243628 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l" (OuterVolumeSpecName: "kube-api-access-vb52l") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "kube-api-access-vb52l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.257386 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "sensubility-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.258812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.259210 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.260559 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.270308 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.271042 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338162 5116 reconciler_common.go:299] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338238 5116 reconciler_common.go:299] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338252 5116 reconciler_common.go:299] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338267 5116 reconciler_common.go:299] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338281 5116 reconciler_common.go:299] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338294 5116 reconciler_common.go:299] "Volume 
detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338306 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.698450 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:31 crc kubenswrapper[5116]: E0322 00:36:31.698997 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.887038 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerDied","Data":"aceab4cbda2cdab69ea549cea0457aaa095384a3a7237572e46e8cc591a43c67"} Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.887090 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aceab4cbda2cdab69ea549cea0457aaa095384a3a7237572e46e8cc591a43c67" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.887150 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:36:33 crc kubenswrapper[5116]: I0322 00:36:33.220096 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-fq6th_5256051f-7482-42a0-921d-0f3073e5f5fb/smoketest-collectd/0.log" Mar 22 00:36:33 crc kubenswrapper[5116]: I0322 00:36:33.493125 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-fq6th_5256051f-7482-42a0-921d-0f3073e5f5fb/smoketest-ceilometer/0.log" Mar 22 00:36:33 crc kubenswrapper[5116]: I0322 00:36:33.785998 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-55bf8d5cb-9ppbp_8d54082f-1922-421a-85c0-77b5d30d7e68/default-interconnect/0.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.043000 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l_8e2f0d9c-7207-4164-a27f-9efeba6e22bb/bridge/2.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.308120 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l_8e2f0d9c-7207-4164-a27f-9efeba6e22bb/sg-core/0.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.597828 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-664479c99b-v2bfp_f826e257-238f-4715-8010-f30569577292/bridge/2.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.879671 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-664479c99b-v2bfp_f826e257-238f-4715-8010-f30569577292/sg-core/0.log" Mar 22 00:36:35 crc kubenswrapper[5116]: I0322 00:36:35.145434 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q_c2a84a49-cbcf-41dd-8ed2-1df6cd7db259/bridge/2.log" Mar 22 00:36:35 crc kubenswrapper[5116]: I0322 00:36:35.441540 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q_c2a84a49-cbcf-41dd-8ed2-1df6cd7db259/sg-core/0.log" Mar 22 00:36:35 crc kubenswrapper[5116]: I0322 00:36:35.757870 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6_874ff775-e791-4357-9719-a773b3a8e4d8/bridge/2.log" Mar 22 00:36:36 crc kubenswrapper[5116]: I0322 00:36:36.063565 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6_874ff775-e791-4357-9719-a773b3a8e4d8/sg-core/0.log" Mar 22 00:36:36 crc kubenswrapper[5116]: I0322 00:36:36.354544 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5_1d6d17f0-5973-421e-9838-1a6195ca1731/bridge/2.log" Mar 22 00:36:36 crc kubenswrapper[5116]: I0322 00:36:36.666782 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5_1d6d17f0-5973-421e-9838-1a6195ca1731/sg-core/0.log" Mar 22 00:36:39 crc kubenswrapper[5116]: I0322 00:36:39.949456 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-564975b589-lf9hg_5f8e2cff-6459-4aae-8cf6-a48587311f68/operator/0.log" Mar 22 00:36:40 crc kubenswrapper[5116]: I0322 00:36:40.248407 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_bd2bf44b-01e8-4236-9dda-90998dd75b88/prometheus/0.log" Mar 22 00:36:40 crc kubenswrapper[5116]: I0322 00:36:40.531882 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_ccb103ab-2a74-44b8-b853-0da2e0b4a6b5/elasticsearch/0.log" Mar 22 00:36:40 crc kubenswrapper[5116]: I0322 00:36:40.859241 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-694dc457d5-77wwx_536dbd5d-337e-43fa-925a-e88d3be7da06/prometheus-webhook-snmp/0.log" Mar 22 00:36:41 crc kubenswrapper[5116]: I0322 00:36:41.205432 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_6a4e1e85-870e-408e-a4dd-c5a7d7fcecae/alertmanager/0.log" Mar 22 00:36:44 crc kubenswrapper[5116]: I0322 00:36:44.697533 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:44 crc kubenswrapper[5116]: E0322 00:36:44.699232 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:56 crc kubenswrapper[5116]: I0322 00:36:56.698424 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:56 crc kubenswrapper[5116]: E0322 00:36:56.699396 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:57 crc kubenswrapper[5116]: I0322 
00:36:57.320767 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-55b77595c-67kjz_fcfe47c3-284e-4c8b-b35f-06a0499313a5/operator/0.log" Mar 22 00:36:57 crc kubenswrapper[5116]: I0322 00:36:57.796248 5116 scope.go:117] "RemoveContainer" containerID="85bd7d245d43142b219c3be5ac468eafb8ba3e6e6a34155393343c620d8b140b" Mar 22 00:37:00 crc kubenswrapper[5116]: I0322 00:37:00.902930 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-564975b589-lf9hg_5f8e2cff-6459-4aae-8cf6-a48587311f68/operator/0.log" Mar 22 00:37:01 crc kubenswrapper[5116]: I0322 00:37:01.200767 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_90868bad-f18e-4266-a1cf-e633cccb0c4b/qdr/0.log" Mar 22 00:37:11 crc kubenswrapper[5116]: I0322 00:37:11.697317 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:37:11 crc kubenswrapper[5116]: E0322 00:37:11.698484 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:37:23 crc kubenswrapper[5116]: I0322 00:37:23.697840 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:37:23 crc kubenswrapper[5116]: E0322 00:37:23.698869 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.200533 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"] Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202005 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerName="oc" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202036 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerName="oc" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202054 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-collectd" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202064 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-collectd" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202142 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-ceilometer" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202157 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-ceilometer" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202370 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerName="oc" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202409 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" 
containerName="smoketest-collectd" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202427 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-ceilometer" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.208406 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.210223 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mmn2b\"/\"openshift-service-ca.crt\"" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.211218 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mmn2b\"/\"kube-root-ca.crt\"" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.216016 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"] Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.353495 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.353799 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.455705 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.455799 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.456204 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.478455 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.540635 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.864053 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"] Mar 22 00:37:27 crc kubenswrapper[5116]: I0322 00:37:27.521762 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerStarted","Data":"046ccfde5254025be74845f8d01e12de4bae01aa5036280050a66768e75af1f8"} Mar 22 00:37:33 crc kubenswrapper[5116]: I0322 00:37:33.576571 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerStarted","Data":"c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e"} Mar 22 00:37:33 crc kubenswrapper[5116]: I0322 00:37:33.577320 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerStarted","Data":"d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864"} Mar 22 00:37:33 crc kubenswrapper[5116]: I0322 00:37:33.595494 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" podStartSLOduration=1.621485078 podStartE2EDuration="7.595479431s" podCreationTimestamp="2026-03-22 00:37:26 +0000 UTC" firstStartedPulling="2026-03-22 00:37:26.875948609 +0000 UTC m=+1717.898249982" lastFinishedPulling="2026-03-22 00:37:32.849942972 +0000 UTC m=+1723.872244335" observedRunningTime="2026-03-22 00:37:33.594636125 +0000 UTC m=+1724.616937498" watchObservedRunningTime="2026-03-22 00:37:33.595479431 +0000 UTC m=+1724.617780804" Mar 22 00:37:34 crc kubenswrapper[5116]: I0322 00:37:34.697302 5116 scope.go:117] "RemoveContainer" 
containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:37:34 crc kubenswrapper[5116]: E0322 00:37:34.697840 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:37:47 crc kubenswrapper[5116]: I0322 00:37:47.697057 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:37:47 crc kubenswrapper[5116]: E0322 00:37:47.697935 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:37:57 crc kubenswrapper[5116]: I0322 00:37:57.932713 5116 scope.go:117] "RemoveContainer" containerID="9efa61524ace63b51ff35c7e96502efaaf7a080de1173648c97977985912b667" Mar 22 00:37:58 crc kubenswrapper[5116]: I0322 00:37:58.008256 5116 scope.go:117] "RemoveContainer" containerID="4106dafe916c442ff6a6c84ef7c764001d5f6257e8b17fde82563cb3dcaa24f7" Mar 22 00:37:58 crc kubenswrapper[5116]: I0322 00:37:58.084711 5116 scope.go:117] "RemoveContainer" containerID="095ee6b2ca6fc427f5a465f41b63b9951a947f316d689c28f653b52df20fd554" Mar 22 00:37:58 crc kubenswrapper[5116]: I0322 00:37:58.174536 5116 scope.go:117] "RemoveContainer" containerID="2a6933fce9cc74d356d1d5a231d547fdb2259fd5ca76515682cf2f100750a7ff" Mar 22 00:38:00 crc 
kubenswrapper[5116]: I0322 00:38:00.132786 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"] Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.140457 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.149297 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"] Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.224390 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568998-4b5vm"] Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.229027 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.232052 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.232117 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.232409 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.236002 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568998-4b5vm"] Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.293329 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"auto-csr-approver-29568998-4b5vm\" (UID: 
\"7926ef86-b333-4a8b-98ca-7a71240e7702\") " pod="openshift-infra/auto-csr-approver-29568998-4b5vm" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.293436 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"infrawatch-operators-4fdg5\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") " pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.394453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"infrawatch-operators-4fdg5\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") " pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.394614 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"auto-csr-approver-29568998-4b5vm\" (UID: \"7926ef86-b333-4a8b-98ca-7a71240e7702\") " pod="openshift-infra/auto-csr-approver-29568998-4b5vm" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.420904 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"infrawatch-operators-4fdg5\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") " pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.422140 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qsrm\" (UniqueName: 
\"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"auto-csr-approver-29568998-4b5vm\" (UID: \"7926ef86-b333-4a8b-98ca-7a71240e7702\") " pod="openshift-infra/auto-csr-approver-29568998-4b5vm" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.477788 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.566136 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.898775 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"] Mar 22 00:38:00 crc kubenswrapper[5116]: W0322 00:38:00.901032 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb75cdb_9b93_44dd_9523_50b2c6f734d2.slice/crio-9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1 WatchSource:0}: Error finding container 9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1: Status 404 returned error can't find the container with id 9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1 Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.003520 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568998-4b5vm"] Mar 22 00:38:01 crc kubenswrapper[5116]: W0322 00:38:01.008719 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7926ef86_b333_4a8b_98ca_7a71240e7702.slice/crio-a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871 WatchSource:0}: Error finding container a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871: Status 404 returned error can't find the container with id 
a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871 Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.697309 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:38:01 crc kubenswrapper[5116]: E0322 00:38:01.697959 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.825247 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" event={"ID":"7926ef86-b333-4a8b-98ca-7a71240e7702","Type":"ContainerStarted","Data":"a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871"} Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.826903 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerStarted","Data":"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"} Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.826968 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerStarted","Data":"9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1"} Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.847692 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-4fdg5" podStartSLOduration=1.681940601 podStartE2EDuration="1.84766172s" podCreationTimestamp="2026-03-22 00:38:00 
+0000 UTC" firstStartedPulling="2026-03-22 00:38:00.90246418 +0000 UTC m=+1751.924765553" lastFinishedPulling="2026-03-22 00:38:01.068185299 +0000 UTC m=+1752.090486672" observedRunningTime="2026-03-22 00:38:01.840067417 +0000 UTC m=+1752.862368830" watchObservedRunningTime="2026-03-22 00:38:01.84766172 +0000 UTC m=+1752.869963133" Mar 22 00:38:02 crc kubenswrapper[5116]: I0322 00:38:02.837146 5116 generic.go:358] "Generic (PLEG): container finished" podID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerID="f1799b324b05b5da6abc7fe5f1b11bb1a37a71d6056bbd615181dd99a25ef6af" exitCode=0 Mar 22 00:38:02 crc kubenswrapper[5116]: I0322 00:38:02.837318 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" event={"ID":"7926ef86-b333-4a8b-98ca-7a71240e7702","Type":"ContainerDied","Data":"f1799b324b05b5da6abc7fe5f1b11bb1a37a71d6056bbd615181dd99a25ef6af"} Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.168360 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.257510 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"7926ef86-b333-4a8b-98ca-7a71240e7702\" (UID: \"7926ef86-b333-4a8b-98ca-7a71240e7702\") " Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.272138 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm" (OuterVolumeSpecName: "kube-api-access-2qsrm") pod "7926ef86-b333-4a8b-98ca-7a71240e7702" (UID: "7926ef86-b333-4a8b-98ca-7a71240e7702"). InnerVolumeSpecName "kube-api-access-2qsrm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.359087 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") on node \"crc\" DevicePath \"\"" Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.868549 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.868654 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" event={"ID":"7926ef86-b333-4a8b-98ca-7a71240e7702","Type":"ContainerDied","Data":"a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871"} Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.868692 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871" Mar 22 00:38:05 crc kubenswrapper[5116]: I0322 00:38:05.244045 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"] Mar 22 00:38:05 crc kubenswrapper[5116]: I0322 00:38:05.254842 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"] Mar 22 00:38:05 crc kubenswrapper[5116]: I0322 00:38:05.706517 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" path="/var/lib/kubelet/pods/6a9e5029-c0b6-4edb-a790-d67f5791bab2/volumes" Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.478838 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.479437 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.520156 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.951661 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.992560 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"] Mar 22 00:38:12 crc kubenswrapper[5116]: I0322 00:38:12.933094 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-4fdg5" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server" containerID="cri-o://581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213" gracePeriod=2 Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.341634 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.373240 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") " Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.394454 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx" (OuterVolumeSpecName: "kube-api-access-t79vx") pod "1cb75cdb-9b93-44dd-9523-50b2c6f734d2" (UID: "1cb75cdb-9b93-44dd-9523-50b2c6f734d2"). InnerVolumeSpecName "kube-api-access-t79vx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.474688 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") on node \"crc\" DevicePath \"\"" Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941742 5116 generic.go:358] "Generic (PLEG): container finished" podID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213" exitCode=0 Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941865 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerDied","Data":"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"} Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941883 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5" Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941914 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerDied","Data":"9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1"} Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941936 5116 scope.go:117] "RemoveContainer" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213" Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.963772 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"] Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.969629 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"] Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.970294 5116 scope.go:117] "RemoveContainer" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213" Mar 22 00:38:13 crc kubenswrapper[5116]: E0322 00:38:13.972771 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213\": container with ID starting with 581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213 not found: ID does not exist" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213" Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.972815 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"} err="failed to get container status \"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213\": rpc error: code = NotFound desc = could not find container 
\"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213\": container with ID starting with 581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213 not found: ID does not exist" Mar 22 00:38:14 crc kubenswrapper[5116]: I0322 00:38:14.697363 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:38:14 crc kubenswrapper[5116]: E0322 00:38:14.697778 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:38:15 crc kubenswrapper[5116]: I0322 00:38:15.709843 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" path="/var/lib/kubelet/pods/1cb75cdb-9b93-44dd-9523-50b2c6f734d2/volumes" Mar 22 00:38:19 crc kubenswrapper[5116]: I0322 00:38:19.514700 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-75ffdb6fcd-m65lb_0910a8a1-0226-42c8-ab1d-b142d2b7a00d/control-plane-machine-set-operator/0.log" Mar 22 00:38:19 crc kubenswrapper[5116]: I0322 00:38:19.639559 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-w2nq2_9884d9ba-fbeb-40db-8105-de302262478b/kube-rbac-proxy/0.log" Mar 22 00:38:19 crc kubenswrapper[5116]: I0322 00:38:19.684571 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-w2nq2_9884d9ba-fbeb-40db-8105-de302262478b/machine-api-operator/0.log" Mar 22 00:38:28 crc kubenswrapper[5116]: I0322 00:38:28.697558 5116 scope.go:117] "RemoveContainer" 
containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:38:28 crc kubenswrapper[5116]: E0322 00:38:28.698679 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:38:32 crc kubenswrapper[5116]: I0322 00:38:32.562389 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-759f64656b-nt75l_e6d082fa-fedb-4089-87be-6bd1f0922f14/cert-manager-controller/0.log" Mar 22 00:38:32 crc kubenswrapper[5116]: I0322 00:38:32.671359 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-8966b78d4-mnbt9_cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69/cert-manager-cainjector/0.log" Mar 22 00:38:32 crc kubenswrapper[5116]: I0322 00:38:32.754517 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-597b96b99b-8tskc_57a02b12-f3d8-4264-af22-4fb7bc40602f/cert-manager-webhook/0.log" Mar 22 00:38:42 crc kubenswrapper[5116]: I0322 00:38:42.698033 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:38:42 crc kubenswrapper[5116]: E0322 00:38:42.698675 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:38:48 crc 
kubenswrapper[5116]: I0322 00:38:48.051004 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-55568fc96c-krbrc_1a434146-4e47-4733-9f73-955a4c92f2d2/prometheus-operator/0.log" Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.163640 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd_399d0e55-3ad7-48ad-ab17-d0ab1fb9879f/prometheus-operator-admission-webhook/0.log" Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.265880 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf_2d0f143c-b305-43e1-937e-020d84101219/prometheus-operator-admission-webhook/0.log" Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.348487 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-587f9c8867-sxrpm_b998a8ef-dbc2-4004-a589-608b0bf774e7/operator/0.log" Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.478367 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bff5dbc55-tpg7b_9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8/perses-operator/0.log" Mar 22 00:38:54 crc kubenswrapper[5116]: I0322 00:38:54.698244 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:38:54 crc kubenswrapper[5116]: E0322 00:38:54.698897 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:38:56 crc kubenswrapper[5116]: 
I0322 00:38:56.231897 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:38:56 crc kubenswrapper[5116]: I0322 00:38:56.232004 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:38:56 crc kubenswrapper[5116]: I0322 00:38:56.239904 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:38:56 crc kubenswrapper[5116]: I0322 00:38:56.239981 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:38:58 crc kubenswrapper[5116]: I0322 00:38:58.307985 5116 scope.go:117] "RemoveContainer" containerID="b895d59dbbcf506c3c0f496201be78280c2900728c487398d81316e12e931fac" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.433425 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/util/0.log" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.586036 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/util/0.log" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.617094 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/pull/0.log" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.617122 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/pull/0.log" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.787098 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/pull/0.log" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.798523 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/util/0.log" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.803509 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/extract/0.log" Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.933311 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/util/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.104511 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/util/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.142457 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/pull/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.145323 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/pull/0.log" Mar 22 
00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.345734 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/pull/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.370208 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/extract/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.374279 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/util/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.485874 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/util/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.667472 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/util/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.697864 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/pull/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.770724 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/pull/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.870880 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/util/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.896521 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/pull/0.log" Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.957506 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/extract/0.log" Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.069660 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/util/0.log" Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.203601 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/pull/0.log" Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.208733 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/util/0.log" Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.217664 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/pull/0.log" Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.367818 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/extract/0.log" Mar 
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.497391 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/util/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.512151 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/pull/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.638747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-utilities/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.792268 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-utilities/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.797328 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-content/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.806598 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.050241 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.108602 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.271865 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.275583 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/registry-server/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.390051 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.413577 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.449219 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.554796 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.568101 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.655296 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-547dbd544d-c8r5t_ba8a6a03-e32a-4121-86e1-d856ddf7a73b/marketplace-operator/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.790119 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.863845 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/registry-server/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.973671 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.993747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.998190 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-content/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.152797 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-content/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.162515 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-utilities/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.375603 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/registry-server/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.697440 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:39:07 crc kubenswrapper[5116]: E0322 00:39:07.698773 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.706345 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:39:21 crc kubenswrapper[5116]: E0322 00:39:21.707216 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.763275 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-55568fc96c-krbrc_1a434146-4e47-4733-9f73-955a4c92f2d2/prometheus-operator/0.log"
Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.780649 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf_2d0f143c-b305-43e1-937e-020d84101219/prometheus-operator-admission-webhook/0.log"
Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.787451 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd_399d0e55-3ad7-48ad-ab17-d0ab1fb9879f/prometheus-operator-admission-webhook/0.log"
Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.882719 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-587f9c8867-sxrpm_b998a8ef-dbc2-4004-a589-608b0bf774e7/operator/0.log"
Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.914941 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bff5dbc55-tpg7b_9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8/perses-operator/0.log"
Mar 22 00:39:35 crc kubenswrapper[5116]: I0322 00:39:35.698333 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:39:36 crc kubenswrapper[5116]: I0322 00:39:36.666256 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"495f94902224bb639caa02902e55778bab7314c3bab4e573936b06f73f196f83"}
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.145499 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29569000-4mj96"]
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146898 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerName="oc"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146915 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerName="oc"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146946 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146954 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.147095 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.147115 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerName="oc"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.152299 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.155217 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.156422 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.161598 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.168317 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29569000-4mj96"]
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.286026 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"auto-csr-approver-29569000-4mj96\" (UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") " pod="openshift-infra/auto-csr-approver-29569000-4mj96"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.387434 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"auto-csr-approver-29569000-4mj96\" (UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") " pod="openshift-infra/auto-csr-approver-29569000-4mj96"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.413601 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"auto-csr-approver-29569000-4mj96\" (UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") " pod="openshift-infra/auto-csr-approver-29569000-4mj96"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.492931 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96"
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.727299 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29569000-4mj96"]
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.729872 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.914909 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569000-4mj96" event={"ID":"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2","Type":"ContainerStarted","Data":"a166a3f12307eda4d9559f27f7cae2f024cea0472280b827ba50dca2b4dbe2b5"}
Mar 22 00:40:02 crc kubenswrapper[5116]: I0322 00:40:02.935663 5116 generic.go:358] "Generic (PLEG): container finished" podID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerID="7c29c71b7a58a810ccca621c67305721af7b2030d4cd67fc359b6cb2c54817ad" exitCode=0
Mar 22 00:40:02 crc kubenswrapper[5116]: I0322 00:40:02.936295 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569000-4mj96" event={"ID":"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2","Type":"ContainerDied","Data":"7c29c71b7a58a810ccca621c67305721af7b2030d4cd67fc359b6cb2c54817ad"}
Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.312901 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96"
Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.455839 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\" (UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") "
Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.464844 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k" (OuterVolumeSpecName: "kube-api-access-tg95k") pod "2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" (UID: "2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2"). InnerVolumeSpecName "kube-api-access-tg95k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.557206 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") on node \"crc\" DevicePath \"\""
Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.969841 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569000-4mj96" event={"ID":"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2","Type":"ContainerDied","Data":"a166a3f12307eda4d9559f27f7cae2f024cea0472280b827ba50dca2b4dbe2b5"}
Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.970304 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a166a3f12307eda4d9559f27f7cae2f024cea0472280b827ba50dca2b4dbe2b5"
Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.969979 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96"
Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.413008 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"]
Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.423229 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"]
Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.707312 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" path="/var/lib/kubelet/pods/a7f4a688-3ca1-4538-9b16-323899848ec1/volumes"
Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.983241 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerID="d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864" exitCode=0
Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.983322 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerDied","Data":"d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864"}
Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.984049 5116 scope.go:117] "RemoveContainer" containerID="d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864"
Mar 22 00:40:06 crc kubenswrapper[5116]: I0322 00:40:06.416422 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/gather/0.log"
Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.767026 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"]
Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.769898 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy" containerID="cri-o://c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e" gracePeriod=2
Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.773448 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"]
Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.774759 5116 status_manager.go:895] "Failed to get status for pod" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" err="pods \"must-gather-pxgv6\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-mmn2b\": no relationship found between node 'crc' and this object"
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.067435 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/copy/0.log"
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.068694 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerID="c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e" exitCode=143
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.186207 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/copy/0.log"
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.186588 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6"
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.187973 5116 status_manager.go:895] "Failed to get status for pod" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" err="pods \"must-gather-pxgv6\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-mmn2b\": no relationship found between node 'crc' and this object"
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.320412 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") "
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.320539 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") "
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.326395 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw" (OuterVolumeSpecName: "kube-api-access-czwdw") pod "4fa843c6-f428-4c50-8fb6-58db4910b8d8" (UID: "4fa843c6-f428-4c50-8fb6-58db4910b8d8"). InnerVolumeSpecName "kube-api-access-czwdw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.373196 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4fa843c6-f428-4c50-8fb6-58db4910b8d8" (UID: "4fa843c6-f428-4c50-8fb6-58db4910b8d8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.421963 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") on node \"crc\" DevicePath \"\""
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.421997 5116 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.706036 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" path="/var/lib/kubelet/pods/4fa843c6-f428-4c50-8fb6-58db4910b8d8/volumes"
Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.077621 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/copy/0.log"
Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.078726 5116 scope.go:117] "RemoveContainer" containerID="c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e"
Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.078857 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6"
Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.095767 5116 scope.go:117] "RemoveContainer" containerID="d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864"
Mar 22 00:40:58 crc kubenswrapper[5116]: I0322 00:40:58.493594 5116 scope.go:117] "RemoveContainer" containerID="1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.128577 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"]
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129689 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129701 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129727 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerName="oc"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129734 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerName="oc"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129751 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="gather"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129756 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="gather"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129842 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="gather"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129852 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129869 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerName="oc"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.139959 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.150535 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"]
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.265606 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.265691 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.265831 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.306855 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"]
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.314387 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.318990 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"]
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.367623 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.367693 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.367754 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.368245 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.368409 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.389513 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.469649 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.469780 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.469850 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.493589 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.571569 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.571846 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.571953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.572477 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.572549 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.593906 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.636735 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb"
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.868313 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"]
Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.991322 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"]
Mar 22 00:41:03 crc kubenswrapper[5116]: W0322 00:41:03.994889 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80416337_e9da_4316_8e19_60269a38f953.slice/crio-248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521 WatchSource:0}: Error finding container 248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521: Status 404 returned error can't find the container with id 248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.549726 5116 generic.go:358] "Generic (PLEG): container finished" podID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" exitCode=0
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.549805 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2"}
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.550202 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerStarted","Data":"c706029a199b983363e08ea2f49456a42ed0bfa1c14b04db0528466c7004cb3d"}
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.554020 5116 generic.go:358] "Generic (PLEG): container finished" podID="80416337-e9da-4316-8e19-60269a38f953" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" exitCode=0
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.554108 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967"}
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.554139 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerStarted","Data":"248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521"}
Mar 22 00:41:05 crc kubenswrapper[5116]: I0322 00:41:05.568752 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerStarted","Data":"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693"}
Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.580664 5116 generic.go:358] "Generic (PLEG): container finished" podID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" exitCode=0
Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.580809 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172"}
Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.587245 5116 generic.go:358] "Generic (PLEG): container finished" podID="80416337-e9da-4316-8e19-60269a38f953" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" exitCode=0
Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.587615 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693"}
Mar 22 00:41:07 crc kubenswrapper[5116]: I0322 00:41:07.599754 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerStarted","Data":"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec"}
Mar 22 00:41:07 crc kubenswrapper[5116]: I0322 00:41:07.605307 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerStarted","Data":"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f"}
Mar 22 00:41:07 crc kubenswrapper[5116]: I0322 00:41:07.639676 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmpvb" podStartSLOduration=3.800037667 podStartE2EDuration="4.639651474s" podCreationTimestamp="2026-03-22 00:41:03 +0000 UTC"
firstStartedPulling="2026-03-22 00:41:04.550736913 +0000 UTC m=+1935.573038286" lastFinishedPulling="2026-03-22 00:41:05.39035071 +0000 UTC m=+1936.412652093" observedRunningTime="2026-03-22 00:41:07.621802607 +0000 UTC m=+1938.644104020" watchObservedRunningTime="2026-03-22 00:41:07.639651474 +0000 UTC m=+1938.661952857" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.493895 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.494325 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.559456 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.597809 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h7bwj" podStartSLOduration=9.867506081 podStartE2EDuration="10.597781571s" podCreationTimestamp="2026-03-22 00:41:03 +0000 UTC" firstStartedPulling="2026-03-22 00:41:04.555750666 +0000 UTC m=+1935.578052079" lastFinishedPulling="2026-03-22 00:41:05.286026196 +0000 UTC m=+1936.308327569" observedRunningTime="2026-03-22 00:41:07.644883706 +0000 UTC m=+1938.667185089" watchObservedRunningTime="2026-03-22 00:41:13.597781571 +0000 UTC m=+1944.620082954" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.637803 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.637887 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.712199 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.812092 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:14 crc kubenswrapper[5116]: I0322 00:41:14.695613 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hmpvb" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="registry-server" probeResult="failure" output=< Mar 22 00:41:14 crc kubenswrapper[5116]: timeout: failed to connect service ":50051" within 1s Mar 22 00:41:14 crc kubenswrapper[5116]: > Mar 22 00:41:15 crc kubenswrapper[5116]: I0322 00:41:15.680487 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h7bwj" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="registry-server" containerID="cri-o://cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" gracePeriod=2 Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.633544 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.689924 5116 generic.go:358] "Generic (PLEG): container finished" podID="80416337-e9da-4316-8e19-60269a38f953" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" exitCode=0 Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690038 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690051 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f"} Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690124 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521"} Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690156 5116 scope.go:117] "RemoveContainer" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690513 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"80416337-e9da-4316-8e19-60269a38f953\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690692 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"80416337-e9da-4316-8e19-60269a38f953\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690719 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"80416337-e9da-4316-8e19-60269a38f953\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " Mar 22 00:41:16 crc 
kubenswrapper[5116]: I0322 00:41:16.691711 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities" (OuterVolumeSpecName: "utilities") pod "80416337-e9da-4316-8e19-60269a38f953" (UID: "80416337-e9da-4316-8e19-60269a38f953"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.696290 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g" (OuterVolumeSpecName: "kube-api-access-jrg7g") pod "80416337-e9da-4316-8e19-60269a38f953" (UID: "80416337-e9da-4316-8e19-60269a38f953"). InnerVolumeSpecName "kube-api-access-jrg7g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.730882 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80416337-e9da-4316-8e19-60269a38f953" (UID: "80416337-e9da-4316-8e19-60269a38f953"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.748535 5116 scope.go:117] "RemoveContainer" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.775614 5116 scope.go:117] "RemoveContainer" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.792472 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.792510 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.792522 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.797019 5116 scope.go:117] "RemoveContainer" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" Mar 22 00:41:16 crc kubenswrapper[5116]: E0322 00:41:16.797695 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f\": container with ID starting with cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f not found: ID does not exist" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.797748 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f"} err="failed to get container status \"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f\": rpc error: code = NotFound desc = could not find container \"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f\": container with ID starting with cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f not found: ID does not exist" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.797768 5116 scope.go:117] "RemoveContainer" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" Mar 22 00:41:16 crc kubenswrapper[5116]: E0322 00:41:16.798098 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693\": container with ID starting with a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693 not found: ID does not exist" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.798183 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693"} err="failed to get container status \"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693\": rpc error: code = NotFound desc = could not find container \"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693\": container with ID starting with a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693 not found: ID does not exist" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.798215 5116 scope.go:117] "RemoveContainer" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" Mar 22 00:41:16 crc kubenswrapper[5116]: E0322 00:41:16.798516 5116 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967\": container with ID starting with 74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967 not found: ID does not exist" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.798580 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967"} err="failed to get container status \"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967\": rpc error: code = NotFound desc = could not find container \"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967\": container with ID starting with 74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967 not found: ID does not exist" Mar 22 00:41:17 crc kubenswrapper[5116]: I0322 00:41:17.031699 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:17 crc kubenswrapper[5116]: I0322 00:41:17.041152 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:17 crc kubenswrapper[5116]: I0322 00:41:17.711439 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80416337-e9da-4316-8e19-60269a38f953" path="/var/lib/kubelet/pods/80416337-e9da-4316-8e19-60269a38f953/volumes" Mar 22 00:41:23 crc kubenswrapper[5116]: I0322 00:41:23.714837 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:23 crc kubenswrapper[5116]: I0322 00:41:23.787598 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:23 crc kubenswrapper[5116]: I0322 00:41:23.958803 5116 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:24 crc kubenswrapper[5116]: I0322 00:41:24.785838 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hmpvb" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="registry-server" containerID="cri-o://a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" gracePeriod=2 Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.248191 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.338764 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.338982 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.339045 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.341087 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities" (OuterVolumeSpecName: "utilities") pod 
"bb7db85b-fa66-42db-bf71-1fe9b38656d0" (UID: "bb7db85b-fa66-42db-bf71-1fe9b38656d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.348542 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l" (OuterVolumeSpecName: "kube-api-access-bqw7l") pod "bb7db85b-fa66-42db-bf71-1fe9b38656d0" (UID: "bb7db85b-fa66-42db-bf71-1fe9b38656d0"). InnerVolumeSpecName "kube-api-access-bqw7l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.440483 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.440521 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.489905 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb7db85b-fa66-42db-bf71-1fe9b38656d0" (UID: "bb7db85b-fa66-42db-bf71-1fe9b38656d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.542442 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800020 5116 generic.go:358] "Generic (PLEG): container finished" podID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" exitCode=0 Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800406 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec"} Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800447 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"c706029a199b983363e08ea2f49456a42ed0bfa1c14b04db0528466c7004cb3d"} Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800479 5116 scope.go:117] "RemoveContainer" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800759 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.830498 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.832335 5116 scope.go:117] "RemoveContainer" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.841599 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.854893 5116 scope.go:117] "RemoveContainer" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.883723 5116 scope.go:117] "RemoveContainer" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" Mar 22 00:41:25 crc kubenswrapper[5116]: E0322 00:41:25.884293 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec\": container with ID starting with a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec not found: ID does not exist" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884364 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec"} err="failed to get container status \"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec\": rpc error: code = NotFound desc = could not find container \"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec\": container with ID starting with a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec not found: ID does 
not exist" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884404 5116 scope.go:117] "RemoveContainer" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" Mar 22 00:41:25 crc kubenswrapper[5116]: E0322 00:41:25.884916 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172\": container with ID starting with 1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172 not found: ID does not exist" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884958 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172"} err="failed to get container status \"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172\": rpc error: code = NotFound desc = could not find container \"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172\": container with ID starting with 1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172 not found: ID does not exist" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884983 5116 scope.go:117] "RemoveContainer" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" Mar 22 00:41:25 crc kubenswrapper[5116]: E0322 00:41:25.885561 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2\": container with ID starting with 9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2 not found: ID does not exist" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.885617 5116 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2"} err="failed to get container status \"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2\": rpc error: code = NotFound desc = could not find container \"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2\": container with ID starting with 9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2 not found: ID does not exist" Mar 22 00:41:27 crc kubenswrapper[5116]: I0322 00:41:27.713008 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" path="/var/lib/kubelet/pods/bb7db85b-fa66-42db-bf71-1fe9b38656d0/volumes" Mar 22 00:41:53 crc kubenswrapper[5116]: I0322 00:41:53.056578 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:41:53 crc kubenswrapper[5116]: I0322 00:41:53.057263 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.161671 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29569002-q66bp"] Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164139 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="extract-utilities" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164197 5116 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="extract-utilities" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164257 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="registry-server" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164270 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="registry-server" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164288 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="registry-server" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164300 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="registry-server" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164323 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="extract-utilities" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164336 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="extract-utilities" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164351 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="extract-content" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164362 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="extract-content" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164406 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="extract-content" Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164418 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="extract-content"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164725 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="registry-server"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.164752 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="registry-server"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.174789 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569002-q66bp"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.176876 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29569002-q66bp"]
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.178537 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.178546 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.179634 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.307333 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ctpm\" (UniqueName: \"kubernetes.io/projected/46bbf206-92ba-49e3-883d-9a89ea3bb3ee-kube-api-access-5ctpm\") pod \"auto-csr-approver-29569002-q66bp\" (UID: \"46bbf206-92ba-49e3-883d-9a89ea3bb3ee\") " pod="openshift-infra/auto-csr-approver-29569002-q66bp"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.408711 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ctpm\" (UniqueName: \"kubernetes.io/projected/46bbf206-92ba-49e3-883d-9a89ea3bb3ee-kube-api-access-5ctpm\") pod \"auto-csr-approver-29569002-q66bp\" (UID: \"46bbf206-92ba-49e3-883d-9a89ea3bb3ee\") " pod="openshift-infra/auto-csr-approver-29569002-q66bp"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.437389 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ctpm\" (UniqueName: \"kubernetes.io/projected/46bbf206-92ba-49e3-883d-9a89ea3bb3ee-kube-api-access-5ctpm\") pod \"auto-csr-approver-29569002-q66bp\" (UID: \"46bbf206-92ba-49e3-883d-9a89ea3bb3ee\") " pod="openshift-infra/auto-csr-approver-29569002-q66bp"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.511655 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569002-q66bp"
Mar 22 00:42:00 crc kubenswrapper[5116]: I0322 00:42:00.805159 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29569002-q66bp"]
Mar 22 00:42:01 crc kubenswrapper[5116]: I0322 00:42:01.162423 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569002-q66bp" event={"ID":"46bbf206-92ba-49e3-883d-9a89ea3bb3ee","Type":"ContainerStarted","Data":"39efa95f4b57399a7429a2550fa4791af0a06cb0152b0132ce213f724f654c9c"}
Mar 22 00:42:03 crc kubenswrapper[5116]: I0322 00:42:03.184652 5116 generic.go:358] "Generic (PLEG): container finished" podID="46bbf206-92ba-49e3-883d-9a89ea3bb3ee" containerID="d8bc60768616e41bd8473ca01cfa1660bc6ac14b949812265c8347bc37ca4e60" exitCode=0
Mar 22 00:42:03 crc kubenswrapper[5116]: I0322 00:42:03.184818 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569002-q66bp" event={"ID":"46bbf206-92ba-49e3-883d-9a89ea3bb3ee","Type":"ContainerDied","Data":"d8bc60768616e41bd8473ca01cfa1660bc6ac14b949812265c8347bc37ca4e60"}
Mar 22 00:42:04 crc kubenswrapper[5116]: I0322 00:42:04.548675 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569002-q66bp"
Mar 22 00:42:04 crc kubenswrapper[5116]: I0322 00:42:04.689491 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ctpm\" (UniqueName: \"kubernetes.io/projected/46bbf206-92ba-49e3-883d-9a89ea3bb3ee-kube-api-access-5ctpm\") pod \"46bbf206-92ba-49e3-883d-9a89ea3bb3ee\" (UID: \"46bbf206-92ba-49e3-883d-9a89ea3bb3ee\") "
Mar 22 00:42:04 crc kubenswrapper[5116]: I0322 00:42:04.700778 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bbf206-92ba-49e3-883d-9a89ea3bb3ee-kube-api-access-5ctpm" (OuterVolumeSpecName: "kube-api-access-5ctpm") pod "46bbf206-92ba-49e3-883d-9a89ea3bb3ee" (UID: "46bbf206-92ba-49e3-883d-9a89ea3bb3ee"). InnerVolumeSpecName "kube-api-access-5ctpm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:42:04 crc kubenswrapper[5116]: I0322 00:42:04.796641 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ctpm\" (UniqueName: \"kubernetes.io/projected/46bbf206-92ba-49e3-883d-9a89ea3bb3ee-kube-api-access-5ctpm\") on node \"crc\" DevicePath \"\""
Mar 22 00:42:05 crc kubenswrapper[5116]: I0322 00:42:05.207239 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569002-q66bp" event={"ID":"46bbf206-92ba-49e3-883d-9a89ea3bb3ee","Type":"ContainerDied","Data":"39efa95f4b57399a7429a2550fa4791af0a06cb0152b0132ce213f724f654c9c"}
Mar 22 00:42:05 crc kubenswrapper[5116]: I0322 00:42:05.207677 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39efa95f4b57399a7429a2550fa4791af0a06cb0152b0132ce213f724f654c9c"
Mar 22 00:42:05 crc kubenswrapper[5116]: I0322 00:42:05.207368 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569002-q66bp"
Mar 22 00:42:05 crc kubenswrapper[5116]: I0322 00:42:05.646415 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"]
Mar 22 00:42:05 crc kubenswrapper[5116]: I0322 00:42:05.658623 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"]
Mar 22 00:42:05 crc kubenswrapper[5116]: I0322 00:42:05.713274 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" path="/var/lib/kubelet/pods/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a/volumes"
Mar 22 00:42:23 crc kubenswrapper[5116]: I0322 00:42:23.056747 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:42:23 crc kubenswrapper[5116]: I0322 00:42:23.057420 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.057819 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.058444 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.058500 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.059118 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"495f94902224bb639caa02902e55778bab7314c3bab4e573936b06f73f196f83"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.059209 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://495f94902224bb639caa02902e55778bab7314c3bab4e573936b06f73f196f83" gracePeriod=600
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.694471 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="495f94902224bb639caa02902e55778bab7314c3bab4e573936b06f73f196f83" exitCode=0
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.694539 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"495f94902224bb639caa02902e55778bab7314c3bab4e573936b06f73f196f83"}
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.694959 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"f68e786864d5003fd16d3d2b7bc31d9b3c8c2ed02dd4ee3e18fa12747959ccfe"}
Mar 22 00:42:53 crc kubenswrapper[5116]: I0322 00:42:53.695002 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:42:58 crc kubenswrapper[5116]: I0322 00:42:58.735866 5116 scope.go:117] "RemoveContainer" containerID="688a5bb9aa102b8d7fbc699af35c11b9bcfdddc98bc6964ddb5e59d34be83553"